Patent abstract:
Abstract: "Self-position calculation apparatus and self-position calculation method". The present invention relates to a self-position calculating apparatus capable of accurately detecting a patterned light beam projected onto a road surface, and of accurately calculating the self-position of a vehicle. The self-position calculating apparatus includes: a light projector (11) configured to project the patterned light beam onto the road surface around a vehicle; an image capture unit (12) configured to capture an image of an area over which the patterned light beam is projected; a patterned light beam extractor (21) configured to extract a position of the patterned light beam from the image; an orientation angle calculator (22) configured to calculate an orientation angle of the vehicle relative to the road surface from the position of the patterned light beam; an orientation change amount calculator (24) configured to calculate an amount of change in the vehicle orientation based on temporal changes at multiple feature points on the road surface in the image; and a self-position calculator (26) configured to calculate a current position and a current orientation angle of the vehicle by adding the amount of change in orientation to an initial position and an initial orientation angle of the vehicle. If a detected condition of the patterned light beam is equal to or greater than a threshold value, the patterned light beam extractor (21) extracts the position of the patterned light beam from a superimposed image generated by superimposing images in frames obtained with the image capture unit (12).
Publication number: BR112017002129B1
Application number: R112017002129-3
Filing date: 2014-08-04
Publication date: 2022-01-04
Inventors: Ichiro Yamaguchi; Hidekazu Nishiuchi; Norihisa Hiraizumi; Jun Matsumoto
Applicant: Nissan Motor Co., Ltd.
Primary IPC classification:
Patent description:

TECHNICAL FIELD
[001] The present invention relates to a self-position calculation apparatus and a self-position calculation method. BACKGROUND OF THE INVENTION
[002] A technique has been conventionally known in which: cameras installed in a vehicle capture images of the surroundings of the vehicle; and an amount of movement of the vehicle is obtained based on changes in the images (see Patent Literature 1, for example). Patent Literature 1 aims to obtain the amount of movement of the vehicle accurately even if the vehicle moves slightly at low speed. For this purpose, a feature point is detected from each image; the position of the feature point is obtained; and thereby the amount of movement of the vehicle is obtained from the direction and distance of movement (amount of movement) of the feature point.
[003] In addition, a technique for performing a three-dimensional measurement using a laser beam projector to project a laser beam in a grid pattern (patterned light beam) has been known (see Patent Literature 2, for example). According to Patent Literature 2, an image of an area of the projected patterned light beam is captured with a camera; the patterned light beam is extracted from the captured image; and a behavior of the vehicle is obtained from the position of the patterned light beam. CITATION LIST PATENT LITERATURE Patent Literature 1: Japanese Patent Application Publication No. 2008-175717 Patent Literature 2: Japanese Patent Application Publication No. 2007-278951 SUMMARY OF THE INVENTION TECHNICAL PROBLEM
[004] In an outdoor environment, however, when a patterned light beam is projected onto a road surface, as described in Patent Literature 2, the patterned light beam is influenced by ambient light. For this reason, it is difficult to detect the patterned beam of light projected onto the road surface. SOLUTION TO THE PROBLEM
[005] The present invention has been made in consideration of the problem mentioned above. An object of the present invention is to provide a self-position calculation apparatus and a self-position calculation method capable of: accurately detecting a patterned light beam projected onto a road surface; and accurately calculating the self-position of a vehicle.
[006] A self-position calculating apparatus according to one aspect of the present invention calculates a current position and a current orientation angle of a vehicle by: projecting a patterned light beam onto a road surface around the vehicle from a light projector; capturing an image of the road surface around the vehicle, including an area onto which the patterned light beam is projected, with an image capture unit; extracting a position of the patterned light beam from the image obtained with the image capture unit; calculating an orientation angle of the vehicle relative to the road surface from the extracted position of the patterned light beam; calculating an amount of change in the vehicle orientation based on temporal changes at multiple feature points on the road surface in the image obtained with the image capture unit; and adding the amount of change in orientation to an initial position and an initial orientation angle of the vehicle. In addition, if a detected condition of the patterned light beam is equal to or greater than a threshold value when the position of the patterned light beam is extracted, a superimposed image is generated by superimposing images in frames obtained with the image capture unit, and the position of the patterned light beam is extracted from the superimposed image. BRIEF DESCRIPTION OF THE DRAWINGS
[007] Figure 1 is a block diagram showing a general configuration of a self-position calculating apparatus of a first embodiment.
[008]Figure 2 is an external view showing an example of how a light projector 11 and a camera 12 are installed in a vehicle 10.
[009] Figure 3(a) is a diagram showing how the positions of illuminated areas on a road surface 31 are calculated using a base length Lb between the light projector 11 and the camera 12, as well as the coordinates (Uj, Vj) of the directed lights in an image. Figure 3(b) is a schematic diagram showing how a direction of motion of the camera 12 is obtained from temporal changes in a feature point detected from an area 33 different from the area over which a patterned light beam 32a is projected.
[010] Figures 4(a) to 4(c) are diagrams relating to an image of the patterned light beam 32a obtained with the camera 12 and subjected to a binarization process. Figure 4(a) is a diagram showing the entire patterned light beam 32a. Figure 4(b) is an enlarged diagram showing a directed light Sp. Figure 4(c) is a diagram showing the center-of-gravity positions He of the respective directed lights Sp extracted by a patterned light beam extractor 21.
[011]Figure 5 is a schematic diagram to explain a method for calculating an amount of change in a distance and an amount of change in an orientation angle.
[012]Figure 6(a) shows an example of a first frame (image) 38 taken at time t. Figure 6(b) shows a second frame 38' taken at time (t + Δt) which is a time duration Δt after time t.
[013] Figure 7(a) shows an amount of vehicle movement that is required to generate a superimposed image when an external environment is bright. Figure 7(b) shows how to generate the superimposed image when the external environment is bright.
[014] Figure 8(a) shows an amount of vehicle movement that is required to generate a superimposed image when the external environment is dark. Figure 8(b) shows how to generate the superimposed image when the external environment is dark.
[015] Figures 9(a) to 9(d) are timing graphs that respectively show a change in a reset flag, a change in the number of images to be superimposed, a change in a condition under which feature points are detected, and a change in the number of associated feature points, in the self-position calculating apparatus of the first embodiment.
[016] Figure 10 is a flowchart showing an example of a self-position calculation method using the self-position calculating apparatus shown in Figure 1.
[017]Figure 11 is a flowchart showing a detailed procedure for step S18 shown in Figure 10.
[018] Figure 12 is a block diagram showing a general configuration of a self-position calculating apparatus of a second embodiment.
[019] Figure 13 is a diagram to explain how to estimate an amount of change in a height of a road surface from a position of a patterned light beam in the second embodiment.
[020] Figures 14(a) to 14(e) are timing graphs that respectively show a change in a reset flag, a predetermined interval of step S201, a change in the number of images to be superimposed, a change in a road surface condition between a good condition and an unsatisfactory condition, and changes in the sizes of bumps (irregularities) of the road surface, in the self-position calculating apparatus of the second embodiment.
[021] Figure 15 is a flowchart showing a process procedure for a self-position calculation process to be performed by the self-position calculating apparatus of the second embodiment.
[022] Figure 16 is a flowchart showing a detailed process procedure for step S28 shown in Figure 15 to be performed by the self-position calculating apparatus of the second embodiment.
[023] Figure 17 is a block diagram showing a general configuration of a self-position calculating apparatus of a third embodiment.
[024] Figures 18(a) and 18(b) are timing graphs respectively showing a change in brightness and a change in a feature point detection flag in the self-position calculating apparatus of the third embodiment.
[025] Figures 19(a) to 19(c) are explanatory diagrams showing patterned light beams and feature points in the self-position calculating apparatus of the third embodiment.
[026] Figures 20(a) to 20(d) are timing graphs that respectively show a change in a reset flag, a change in the timing of the end of each cycle, a change in the number of frames to be superimposed, and a change in the light projection power, in the self-position calculating apparatus of the third embodiment.
[027] Figure 21 is a flowchart showing a process procedure to be performed by the self-position calculating apparatus of the third embodiment. DESCRIPTION OF EMBODIMENTS
[028] Referring to the drawings, descriptions will be provided for the first to third embodiments. In the descriptions of the drawings, the same components are denoted by the same reference numerals, and descriptions for such components will be omitted. First Embodiment Hardware Configuration
[029] Firstly, referring to Figure 1, descriptions will be provided for a hardware configuration of a self-position calculating apparatus of a first embodiment. The self-position calculating apparatus includes a light projector 11, a camera 12 and an engine control unit (ECU) 13. The light projector 11 is installed in a vehicle, and projects a patterned light beam onto a road surface around the vehicle. The camera 12 is installed in the vehicle, and is an example of an image capture unit configured to capture images of the road surface around the vehicle, including an area onto which the patterned light beam is projected. The ECU 13 is an example of a controller configured to control the light projector 11, and to perform a series of information processing cycles to estimate the amount of movement of the vehicle from images obtained with the camera 12.
[030] The camera 12 is a digital camera that uses a solid-state image sensor such as a CCD or CMOS sensor, and obtains digital images that can be processed. What the camera 12 captures is the road surface around the vehicle. The road surface around the vehicle includes road surfaces at the front, rear, sides and underside of the vehicle. As shown in Figure 2, the camera 12 can be installed in a front section of the vehicle 10, more specifically above the front bumper, for example.
[031] The height at and direction in which to set the camera 12 are adjusted in a way that enables the camera 12 to capture images of feature points (textures) on the road surface 31 in front of the vehicle 10 and of the patterned light beam 32b projected from the light projector 11. The focus and diaphragm of the lens of the camera 12 are also automatically adjusted. The camera 12 repeatedly captures images at predetermined time intervals and thereby obtains a series of image groups (frames). The image data obtained with the camera 12 is transferred to the ECU 13, and is stored in a memory included in the ECU 13.
[032] As shown in Figure 2, the light projector 11 projects the patterned light beam 32b, which has a predetermined shape including a square or rectangular grid shape, onto the road surface 31 within an image capture range of the camera 12. The camera 12 captures images of the patterned light beam projected onto the road surface 31. The light projector 11 includes a laser pointer and a diffraction grating, for example. The diffraction grating diffracts the laser beam projected from the pointer. Thereby, as shown in Figures 2 to 4, the light projector 11 generates the patterned light beam (32b, 32a) which includes multiple directed lights Sp arranged in a grid or matrix pattern. In the examples shown in Figures 3 and 4, the light projector 11 generates the patterned light beam 32a which includes 5×7 directed lights Sp.
[033] Again referring to Figure 1, the ECU 13 includes a CPU, a memory, and a microcontroller that includes an input-output section. By executing pre-installed computer programs, the ECU 13 forms multiple information processors included in the self-position calculating apparatus. For each image (frame), the ECU 13 repeatedly performs a series of information processing cycles to calculate the self-position of the vehicle from the images obtained with the camera 12. Incidentally, the ECU 13 may also be used as an ECU for controlling other systems of the vehicle 10.
[034] The multiple information processors include a patterned light beam extractor (superimposed image generator) 21, an orientation angle calculator 22, a feature point detector 23, an orientation change amount calculator 24, a brightness determination section (patterned light beam detection condition determination section) 25, a self-position calculator 26, a patterned light beam controller 27, a detection condition determination section 28 and a calculation state determination section 29. The feature point detector 23 may be included in the orientation change amount calculator 24.
[035] The patterned light beam extractor 21 reads an image obtained with the camera 12 from the memory, and extracts the position of the patterned light beam from the image. For example, as shown in Figure 3(a), the light projector 11 projects the patterned light beam 32a, which includes the multiple directed lights arranged in a matrix pattern, onto the road surface 31, while the camera 12 detects the patterned light beam 32a reflected off the road surface 31. The patterned light beam extractor 21 applies a binarization process to the image obtained with the camera 12, and thereby extracts only an image of the directed lights Sp, as shown in Figures 4(a) and 4(b). As shown in Figure 4(c), the patterned light beam extractor 21 extracts the position of the patterned light beam 32a by calculating the center-of-gravity position He of each directed light Sp, that is, the coordinates (Uj, Vj) of each directed light Sp in the image. The coordinates are expressed using the number assigned to the corresponding pixel in the image sensor of the camera 12. In a case where the patterned light beam includes 5×7 directed lights Sp, "j" is an integer not less than 1 but not greater than 35. The memory stores the coordinates (Uj, Vj) of the directed lights Sp in the image as data about the position of the patterned light beam 32a.
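For illustration only (not part of the original disclosure), the binarization and center-of-gravity extraction described above could be sketched as follows in Python; the brightness threshold and the use of scipy's labelling routines are assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_spot_positions(image, threshold=200):
    """Binarize one camera frame and return the center-of-gravity coordinates
    (Uj, Vj) of each bright spotlight region (illustrative sketch only)."""
    binary = image >= threshold                      # keep only the bright pixels (directed lights Sp)
    labels, num = ndimage.label(binary)              # group connected bright pixels into spots
    # center_of_mass returns (row, col) = (Vj, Uj) for each labelled spot
    centers = ndimage.center_of_mass(binary, labels, range(1, num + 1))
    return [(u, v) for (v, u) in centers]

# usage: spots = extract_spot_positions(gray_frame)  # gray_frame: 8-bit grayscale image
```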
[036] The orientation angle calculator 22 reads the data about the position of the patterned light beam 32a from the memory, and calculates the distance and the orientation angle of the vehicle 10 with respect to the road surface 31 from the position of the patterned light beam 32a in the image obtained with the camera 12. For example, as shown in Figure 3(a), using the trigonometric measurement principle, the orientation angle calculator 22 calculates the position of each illuminated area on the road surface 31, as the position of each illuminated area with respect to the camera 12, from the base length Lb between the light projector 11 and the camera 12, as well as the coordinates (Uj, Vj) of each directed light in the image. Thereafter, the orientation angle calculator 22 calculates a plane equation of the road surface 31 onto which the patterned light beam 32a is projected, that is, the distance and the orientation angle (normal vector) of the camera 12 with respect to the road surface 31, from the position of each directed light with respect to the camera 12. It should be noted that in the embodiment, the distance and the orientation angle of the camera 12 relative to the road surface 31 are calculated as an example of the distance and the orientation angle of the vehicle 10 relative to the road surface 31, since the installation position of the camera 12 in the vehicle 10 and the angle at which the camera 12 captures the images are already known. Hereinafter, the distance and the orientation angle of the camera 12 with respect to the road surface 31 will be abbreviated as "distance and orientation angle". The distance and orientation angle calculated by the orientation angle calculator 22 are stored in the memory.
[037] Specifically, since the camera 12 and the light projector 11 are fixed to the vehicle 10, the direction in which the patterned light beam 32a is projected and the distance (the base length Lb) between the camera 12 and the light projector 11 are already known. For this reason, using the trigonometric measurement principle, the orientation angle calculator 22 is able to obtain the position of each illuminated area on the road surface 31, as the position (Xj, Yj, Zj) of each illuminated area with respect to the camera 12, from the coordinates (Uj, Vj) of each directed light in the image.
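As a hedged illustration of the trigonometric measurement principle mentioned above, the following sketch triangulates one directed light as the closest point between the camera ray and the projector ray; the focal length, the base-length vector and the projector ray direction are assumed, illustrative inputs rather than values from this disclosure.

```python
import numpy as np

def triangulate_spot(uv, f, baseline, proj_dir):
    """Estimate the 3D position (Xj, Yj, Zj) of one illuminated spot relative to
    the camera from its image coordinates (Uj, Vj). 'f' is the focal length in
    pixels, 'baseline' the vector from the camera to the light projector, and
    'proj_dir' the known direction of the laser ray for this spot."""
    u, v = uv
    cam_dir = np.array([u, v, f], dtype=float)
    cam_dir /= np.linalg.norm(cam_dir)            # camera ray through the spot in the image
    d = np.asarray(proj_dir, float) / np.linalg.norm(proj_dir)
    b = np.asarray(baseline, float)
    # closest point between camera ray t*cam_dir and projector ray b + s*d
    A = np.stack([cam_dir, -d], axis=1)
    t, s = np.linalg.lstsq(A, b, rcond=None)[0]
    return t * cam_dir                            # point on the camera ray = (Xj, Yj, Zj)
```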
[038] It should be noted that, in many cases, the positions (Xj, Yj, Zj) of the directed lights relative to the camera 12 are not present in the same plane. This is because the relative position of each directed light changes according to the roughness of the asphalt of the road surface 31. For this reason, the least squares method may be used to obtain a plane equation that minimizes the sum of the distance differences of the directed lights.
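A minimal sketch of such a least-squares plane fit is shown below; the parameterization z = ax + by + c of the road surface plane is an assumption made for illustration.

```python
import numpy as np

def fit_road_plane(points):
    """Fit a plane z = a*x + b*y + c to the spotlight positions by least squares,
    and return the unit normal vector (orientation) and the camera-to-plane distance."""
    pts = np.asarray(points, dtype=float)             # rows of (Xj, Yj, Zj)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    normal = np.array([a, b, -1.0])
    normal /= np.linalg.norm(normal)                  # normal vector of the road surface
    distance = abs(c) / np.linalg.norm([a, b, -1.0])  # distance from the camera origin to the plane
    return normal, distance
```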
[039] The feature point detector 23 reads the image obtained with the camera 12 from the memory, and detects feature points on the road surface 31 from the image read from the memory. In order to detect the feature points on the road surface 31, the feature point detector 23 may use a method described in "D. G. Lowe, "Distinctive Image Features from Scale-Invariant Keypoints", Int. J. Comput. Vis., vol. 60, no. 2, pages 91 to 110, November 2004". Otherwise, a method described in "Kanazawa Yasushi, Kanatani Kenichi, "Detection of Feature Points for Computer Vision", IEICE Journal, vol. 87, no. 12, pages 1043 to 1048, December 2004" may be used.
[040] Specifically, for example, the feature point detector 23 uses the Harris operator or the SUSAN operator so that points, such as vertices of an object, whose luminance values are largely different from those in their vicinities are detected as the feature points. Instead, however, the feature point detector 23 may use a SIFT (Scale-Invariant Feature Transform) feature amount so that points around which the luminance values change with a certain regularity are detected as the feature points. After detecting the feature points, the feature point detector 23 counts the total number N of feature points detected from one image, and assigns identification numbers (i (1 ≤ i ≤ N)) to the respective feature points. The position (Ui, Vi) of each feature point in the image is stored in the memory inside the ECU 13. Figures 6(a) and 6(b) each show examples of the feature points Te detected from the image obtained with the camera 12.
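The idea of Harris-based feature point detection can be pictured with the following sketch; it relies on OpenCV's goodFeaturesToTrack function, and all parameter values are illustrative assumptions rather than values from this disclosure.

```python
import cv2

def detect_feature_points(gray, max_points=200):
    """Detect road-surface feature points Te with the Harris corner measure
    (illustrative sketch only)."""
    corners = cv2.goodFeaturesToTrack(
        gray, maxCorners=max_points, qualityLevel=0.01,
        minDistance=10, useHarrisDetector=True, k=0.04)
    if corners is None:
        return []
    # assign identification numbers i = 1..N and return tuples (i, Ui, Vi)
    return [(i + 1, float(u), float(v))
            for i, (u, v) in enumerate(corners.reshape(-1, 2))]
```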
[041] It should be noted that the present embodiment treats particles of asphalt mixture with a particle size not less than 1 cm but not greater than 2 cm as the feature points on the road surface 31. The camera 12 employs the VGA resolution mode (approximately 300 thousand pixels) in order to detect the feature points. In addition, the distance from the camera 12 to the road surface 31 is approximately 70 cm. Furthermore, the direction in which the camera 12 captures images is tilted approximately 45 degrees toward the road surface 31 from the horizontal plane. Moreover, the luminance value of each image obtained with the camera 12 and subsequently sent to the ECU 13 is within a range of 0 to 255 (0: darkest, 255: brightest).
[042] The orientation change amount calculator 24 selects a previous frame and a current frame from the frames captured in the information processing cycles; and reads the positions (Ui, Vi) of multiple feature points in the image of the previous frame, and the positions (Ui, Vi) of multiple feature points in the image of the current frame, from the memory. Thereafter, based on the changes in the positions of the multiple feature points in the image, the orientation change amount calculator 24 obtains an amount of change in the orientation of the vehicle. In this respect, the "amount of change in the vehicle orientation" includes both the amounts of change in the "distance and orientation angle" relative to the road surface 31 and the "amount of movement of the vehicle (of the camera 12)" on the road surface. Descriptions will be given hereinafter of how to calculate the amounts of change in the distance and orientation angle and the amount of movement of the vehicle.
[043]Figure 6(a) shows an example of a first frame (image) 38 taken at time t. A case is assumed, as shown in Figures 5 and 6(a), in which the relative positions (Xi, Yi, Zi) of three characteristic points Te1, Te2, Te3 are calculated in the first frame 38, for example. In this case, a plane G defined by the characteristic points Te1, Te2, Te3 can be considered as the road surface. Consequently, the orientation change amount calculator 24 is able to obtain the distance and orientation angle (normal vector) of camera 12 with respect to the road surface (plane G) from the relative positions (Xi, Yi , Zi). Furthermore, using known camera models, the orientation change amount calculator 24 is able to obtain a distance l1 between the characteristic points Te1, Te2, a distance l2 between the characteristic points Te2, Te3 and a distance l3 between the characteristic points Te3, Te1, as well as an angle between a straight line joining the characteristic points Te1, Te2 and a straight line joining the characteristic points Te2, Te3, an angle between the straight line joining the characteristic points Te2, Te3 and a straight line joining the characteristic points Te3, Te1, and an angle between the straight line joining the characteristic points Te3, Te1 and the straight line joining the characteristic points Te1, Te2. Camera 12 in Figure 5 shows where the camera is situated when the camera takes the first frame.
[044] It should be noted that the three-dimensional coordinates (Xi, Yi, Zi) of the positions relative to the camera 12 are defined in such a way that: the Z axis coincides with the direction in which the camera 12 captures the image; and the X and Y axes, orthogonal to each other in a plane that includes the camera 12, are normal to the direction in which the camera 12 captures the image. Meanwhile, the coordinates in the image 38 are defined so that: the V axis coincides with the horizontal direction; and the U axis coincides with the vertical direction.
[045] Figure 6(b) shows a second frame 38' taken at time (t + Δt), where Δt is the time duration elapsed from time t. A camera 12' in Figure 5 shows where the camera is situated when the camera captures the second frame 38'. As shown in Figures 5 and 6(b), the camera 12' captures an image that includes the feature points Te1, Te2, Te3 as the second frame 38', and the feature point detector 23 detects the feature points Te1, Te2, Te3 from the image. In this case, the orientation change amount calculator 24 is capable of calculating not only an amount ΔL of movement of the camera 12 in the time duration Δt, but also the amounts of change in the distance and orientation angle, from: the relative position (Xi, Yi, Zi) of each of the feature points Te1, Te2, Te3 at time t; a position P1(Ui, Vi) of each feature point in the second frame 38'; and the camera model of the camera 12. For example, the orientation change amount calculator 24 is capable of calculating the amount (ΔL) of movement of the camera 12 (the vehicle), and the amounts of change in the distance and orientation angle of the camera 12 (the vehicle), by solving the following system of simultaneous equations (1) to (4). Here, Equation (1) is based on the camera 12 being modeled as an ideal pinhole camera free from distortion and optical axis misalignment, where λi and f respectively denote a constant and the focal length. The parameters of the camera model can be calibrated in advance.

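Since Equations (1) to (4) are not reproduced in this text, the following sketch only illustrates the underlying idea: the six-degree-of-freedom motion is found by minimizing the reprojection error of the feature points under an ideal pinhole model. The centered image coordinates, the rotation parameterization and the use of scipy are assumptions, not part of the disclosure.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def estimate_motion(points_3d_prev, points_2d_curr, f):
    """Estimate the frame-to-frame rigid transform of the camera between two
    frames, given the relative positions (Xi, Yi, Zi) of the feature points at
    time t and their image positions (Ui, Vi) at time t + dt (sketch only)."""
    X = np.asarray(points_3d_prev, float)        # (N, 3) positions in the previous camera frame
    uv = np.asarray(points_2d_curr, float)       # (N, 2) observed positions in the current frame

    def residuals(params):
        rot = Rotation.from_rotvec(params[:3])   # previous-to-current frame rotation
        trans = params[3:]                       # previous-to-current frame translation
        Xc = rot.apply(X) + trans                # feature points expressed in the current camera frame
        proj = f * Xc[:, :2] / Xc[:, 2:3]        # ideal pinhole projection
        return (proj - uv).ravel()               # reprojection error to be minimized

    sol = least_squares(residuals, np.zeros(6))
    rotvec, trans = sol.x[:3], sol.x[3:]
    return rotvec, trans                         # np.linalg.norm(trans) corresponds to the amount of movement
```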
[046] Figure 3(b) schematically shows how a direction of movement 34 of the camera 12 is obtained from temporal changes in a feature point detected from an area 33 within the image capture range of the camera 12, which is different from the area over which the patterned light beam 32a is projected. Figures 6(a) and 6(b) show the vectors Dte that represent the directions and amounts of the changes in the positions of the feature points Te, superimposed on an image. The orientation change amount calculator 24 is capable of calculating not only the amount (ΔL) of movement of the camera 12 in the time duration Δt, but also the amounts of change in the distance and orientation angle of the camera 12 in the same time duration. For these reasons, taking the amounts of change in the distance and orientation angle into account, the orientation change amount calculator 24 is able to accurately calculate the amount (ΔL) of movement in six degrees of freedom. In other words, an error in the estimation of the amount (ΔL) of movement can be minimized even if the distance and the orientation angle are changed by rolling or pitching due to a turn, acceleration or deceleration of the vehicle 10.
[047] It should be noted that, instead of using all the feature points whose relative positions are calculated, the orientation change amount calculator 24 may select optimal feature points based on the positional relationships between the feature points. An example of a selection method usable for this purpose is the epipolar geometry (the epipolar line geometry described in R. I. Hartley, "A linear method for reconstruction from lines and points", Proc. 5th International Conference on Computer Vision, Cambridge, Massachusetts, pages 882 to 887 (1995)).
[048] The association of the feature points in the current frame with the feature points in the previous frame can be achieved, for example, by: storing an image of a small area around each detected feature point in the memory; and determining whether the feature points in the current frame and the feature points in the previous frame can be associated with each other based on the similarity in brightness information and color information between them. Specifically, the ECU 13 stores an image of 5 (horizontal) × 5 (vertical) pixels around each detected feature point in the memory. If, in 20 or more pixels of each 5 × 5 pixel image, for example, the difference in brightness information between the corresponding feature point in the current frame and the corresponding feature point in the previous frame is equal to or less than 1%, the orientation change amount calculator 24 determines that the feature point in the current frame and the feature point in the previous frame can be associated with each other.
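A simplified sketch of this 5 × 5 pixel patch comparison is given below; the brute-force search over all point pairs and the handling of image borders are illustrative assumptions.

```python
import numpy as np

def associate_feature_points(prev_img, prev_pts, curr_img, curr_pts,
                             patch=5, min_matching_pixels=20, tol=0.01):
    """Associate feature points between the previous and the current frame by
    comparing the 5 x 5 pixel patches around each point (sketch only)."""
    half = patch // 2
    pairs = []
    for i, (u0, v0) in enumerate(prev_pts):
        u0, v0 = int(round(u0)), int(round(v0))
        p0 = prev_img[v0 - half:v0 + half + 1, u0 - half:u0 + half + 1].astype(float)
        for j, (u1, v1) in enumerate(curr_pts):
            u1, v1 = int(round(u1)), int(round(v1))
            p1 = curr_img[v1 - half:v1 + half + 1, u1 - half:u1 + half + 1].astype(float)
            if p0.shape != p1.shape or p0.size == 0:
                continue                                   # skip points too close to the border
            close = np.abs(p0 - p1) <= tol * 255           # within 1% of the 0-255 brightness range
            if np.count_nonzero(close) >= min_matching_pixels:
                pairs.append((i, j))                       # point i (previous) matches point j (current)
                break
    return pairs
```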
[049] When, as in this case, the feature points Te1, Te2, Te3 whose relative positions (Xi, Yi, Zi) are calculated are detected from an image 38' obtained at a subsequent timing as well, the orientation change amount calculator 24 is able to calculate the "amount of change in the vehicle orientation" based on the temporal changes at the multiple feature points on the road surface.
[050] The self-position calculator 26 calculates the distance and orientation angle from the "amounts of change in the distance and orientation angle" calculated by the orientation change amount calculator 24. In addition, the self-position calculator 26 calculates the current position of the vehicle from the "amount of movement of the vehicle" calculated by the orientation change amount calculator 24.
[051] Specifically, in a case where the distance and the orientation angle calculated by the orientation angle calculator 22 (see Figure 1) are set as the starting points for the calculation by the self-position calculator 26 of the distance and orientation angle, the self-position calculator 26 updates the distance and orientation angle with the most recent numerical values by sequentially adding (performing an integration operation on) the amounts of change in the distance and orientation angle calculated for each frame by the orientation change amount calculator 24 to the starting points (distance and orientation angle). In addition, in a case where the vehicle position obtained when the orientation angle calculator 22 calculates the distance and the orientation angle is set as a starting point (an initial position of the vehicle) for the calculation by the self-position calculator 26 of the current position of the vehicle, the self-position calculator 26 calculates the current position of the vehicle by sequentially adding (performing an integration operation on) the amounts of movement of the vehicle to the initial position. For example, when the starting point (the initial position of the vehicle) is set to match the position of the vehicle on a map, the self-position calculator 26 is able to sequentially calculate the current position of the vehicle on the map.
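The integration operation described above can be pictured with the following simplified sketch, which keeps only a two-dimensional position and a yaw angle as the state; the actual apparatus also tracks the distance and the full orientation angle, so this is an illustration rather than the disclosed method.

```python
import numpy as np

class SelfPositionIntegrator:
    """Update the current vehicle position and orientation by sequentially adding
    the per-frame amounts of change to the starting points (illustrative sketch)."""

    def reset(self, start_position, start_orientation):
        # the starting points: initial position and initial orientation angle
        self.position = np.asarray(start_position, dtype=float)
        self.orientation = float(start_orientation)

    def add_change(self, movement_amount, orientation_change):
        # integration operation: move by the amount of movement along the current
        # heading, then add the amount of change in the orientation angle
        heading = np.array([np.cos(self.orientation), np.sin(self.orientation)])
        self.position += movement_amount * heading
        self.orientation += orientation_change

# usage sketch:
# integrator = SelfPositionIntegrator()
# integrator.reset(start_position=(0.0, 0.0), start_orientation=0.0)
# integrator.add_change(movement_amount=0.05, orientation_change=0.001)
```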
[052] In a case where the feature point detector 23 can continue to detect three or more feature points that can be associated between the previous and current frames, as discussed above, the continuation of the process (integration operation) of adding the amounts of change in the distance and orientation angle allows the self-position calculator 26 to keep updating the distance and orientation angle with the latest numerical values without using the patterned light beam 32a. Nevertheless, the distance and orientation angle calculated using the patterned light beam 32a, or a predetermined initial distance and predetermined initial orientation angle, may be used for the first information processing cycle. In other words, the distance and orientation angle that are the starting points for the integration operation may be calculated using the patterned light beam 32a, or may be set to the predetermined initial values. It is desirable that the predetermined initial distance and the predetermined initial orientation angle be a distance and an orientation angle determined with at least the occupants and payload of the vehicle 10 taken into account. For example, the distance and orientation angle calculated using the patterned light beam 32a projected while the ignition key of the vehicle 10 is on and the shift position is moved from the park position to another position may be used as the predetermined initial distance and predetermined initial orientation angle. In this way, it is possible to obtain a distance and an orientation angle that are not affected by the roll or pitch of the vehicle 10 due to a turn, acceleration or deceleration of the vehicle 10.
[053] The embodiment updates the distance and orientation angle with the most recent numerical values by: calculating the amounts of change in the distance and orientation angle; and sequentially adding the amounts of change thus calculated in the distance and orientation angle. Instead, however, only the amount of change in the orientation angle of the camera 12 relative to the road surface 31 may be calculated and updated. In this case, it can be assumed that the distance from the camera 12 to the road surface 31 remains constant. This makes it possible to reduce the operating load on the ECU 13 while minimizing the error in estimating the amount (ΔL) of movement, taking the amount of change in the orientation angle into account, and to increase the operating speed of the ECU 13.
[054] The detection condition determination section 28 determines whether or not a condition under which the feature point detector 23 detects the feature points Te is too unsatisfactory to satisfy a first criterion. For example, if, like a concrete pavement inside a tunnel, the road surface is less patterned and almost uniform, with few asphalt mixture particles, the feature points detectable from an image of the road surface decrease in number. The reduced number of detectable feature points makes it difficult to continuously detect feature points that can be associated between the previous and current frames, and reduces the accuracy with which the distance and orientation angle are updated.
[055] As a measure against this problem, the detection condition determination section 28 determines that the condition under which the feature point detector 23 detects the feature points Te is too unsatisfactory to satisfy the first criterion if, for example, the number of feature points whose positions relative to the camera 12 are calculated and which can be detected from an image obtained in the subsequent information processing cycle is equal to or less than a predetermined threshold value (three, for example). In other words, if four or more feature points associated between the previous and current frames cannot be detected, the detection condition determination section 28 determines that the condition under which the feature points Te are detected is too unsatisfactory to satisfy the first criterion. Incidentally, at least three feature points associated between the previous and current frames are needed to obtain the amounts of change in the distance and orientation angle. This is because three feature points are needed to define the plane G. Since more feature points are needed to increase the estimation accuracy, it is desirable that the predetermined threshold value be four, five or more.
[056] If the detection condition determination section 28 determines that the condition under which the feature points are detected satisfies the first criterion, the self-position calculator 26 retains the starting points for the integration operations as they are. On the other hand, if the detection condition determination section 28 determines that the condition under which the feature points are detected is too unsatisfactory to satisfy the first criterion, the self-position calculator 26 resets the starting points for the integration operations (the initial orientation angle and the initial position of the vehicle) at the distance and orientation angle calculated by the orientation angle calculator 22 (see Figure 1) in the same information processing cycle, and at the vehicle position obtained at the time of that calculation. Thereafter, the self-position calculator 26 starts adding the amount of change in the vehicle orientation to the starting points thus set.
[057] In the first embodiment, based on the number of feature points associated between the previous and current frames, the detection condition determination section 28 determines under which condition the feature points are detected. Instead, however, it should be noted that the detection condition determination section 28 may be configured so that, based on the total number N of feature points detected from an image, it determines under which condition the feature points are detected. Specifically, the configuration may be such that, if the total number N of feature points detected from an image is equal to or less than a predetermined threshold value (9, for example), the detection condition determination section 28 determines that the condition under which the feature points are detected is too unsatisfactory to satisfy the first criterion. The threshold value for the total number N may be set to a numerical value (9) three times the predetermined threshold value (3) because there is a probability that some of the detected feature points cannot be associated between the previous and current frames.
[058] The calculation state determination section 29 determines whether or not a state of calculation of the distance and orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy a second criterion. For example, in a case where the patterned light beam is projected onto a bump on the road surface 31, the accuracy of calculation of the distance and orientation angle decreases significantly because the bump on the road surface 31 is larger than the dents and bumps of the asphalt pavement. If the condition under which the feature points are detected is too unsatisfactory to satisfy the first criterion and, at the same time, the state of calculation of the distance and orientation angle is too unsatisfactory to satisfy the second criterion, there would otherwise be no means for accurately obtaining the distance and orientation angle, or the amounts of change in the distance and orientation angle.
[059] With this taken into consideration, the calculation state determination section 29 determines that the state of calculation of the distance and orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy the second criterion if the standard deviations of the distance and orientation angle calculated by the orientation angle calculator 22 are greater than predetermined threshold values. Furthermore, if the number of directed lights detected among the 35 directed lights is less than three, the calculation state determination section 29 determines that the state of calculation of the distance and orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy the second criterion, since, theoretically, the plane equation of the road surface 31 cannot be obtained from such a small number of detected directed lights. In a case where the plane equation is obtained using the least squares method, if the absolute value of the maximum difference between the directed lights and the plane obtained by the plane equation is equal to or greater than a certain threshold value (0.05 m, for example), the calculation state determination section 29 may determine that the state of calculation of the distance and orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy the second criterion.
[060] If the detection condition determination section 28 determines that the condition under which the feature points are detected is too unsatisfactory to satisfy the first criterion and, simultaneously, the calculation state determination section 29 determines that the state of calculation of the distance and orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy the second criterion, the self-position calculator 26 uses the distance and orientation angle obtained in the previous information processing cycle, as well as the current position of the vehicle, as the starting points for the integration operations. This makes it possible to minimize an error in calculating the amount of movement of the vehicle.
[061] The patterned light beam controller 27 controls the projection of the patterned light beam 32a by the light projector 11. For example, after the ignition key of the vehicle 10 is turned on, once the self-position calculating apparatus becomes activated, the patterned light beam controller 27 starts projecting the patterned light beam 32a. Thereafter, until the self-position calculating apparatus stops its operation, the patterned light beam controller 27 continues projecting the patterned light beam 32a. Otherwise, the patterned light beam controller 27 may be configured to alternately turn the light projection on and off at predetermined intervals. Instead, the patterned light beam controller 27 may be configured to temporarily project the patterned light beam 32a only when the detection condition determination section 28 determines that the condition under which the feature points Te are detected is too unsatisfactory to satisfy the first criterion.
[062] The brightness determination section (patterned light beam detection condition determination section) 25 determines whether or not a detected condition of the patterned light beam obtained with the camera 12 is equal to or greater than a predetermined threshold value. For example, the brightness determination section 25 determines whether or not the average brightness value (the detected condition of the patterned light beam) of an image obtained with the camera 12 is equal to or greater than a road surface brightness threshold value Bth (the predetermined threshold value). Incidentally, the brightness determination section 25 may instead determine whether or not the illuminance in an image obtained with the camera 12 is equal to or greater than a threshold value. Furthermore, instead of the brightness determination section 25, a luminance sensor may be installed in the vehicle.
[063] The road surface brightness threshold value Bth can be obtained in advance using the following procedure, for example. Firstly, with the vehicle placed in an empty state without people, luggage, fuel, etc., an image of an asphalt-paved road surface is captured with the camera 12 while a patterned light beam is projected onto the road surface. In this case, arrangements are made so that the brightness of the asphalt-paved road surface in the image is substantially uniform. For example, the road surface image is captured with the light environment adjusted so that 95% of the pixels that do not represent the patterned light beam have a brightness within 20 scale points of the average brightness value. Incidentally, the descriptions assume that the brightness values of the images obtained with the camera 12 are within a range of 0 to 255 (where 0 represents the darkest and 255 the brightest). Subsequently, in the obtained image, an average brightness value Bp of the pixels representing the patterned light beam and an average brightness value Ba of the pixels representing the asphalt-paved road surface other than the patterned light beam are compared with each other. This series of processes is repeated, at first using the average brightness value Ba of the pixels representing the asphalt-paved road surface, and subsequently using the values obtained by adding 10 scale points to the average brightness value Ba each time the series of processes is repeated. The road surface brightness threshold value Bth is defined as the increased average brightness value Ba that satisfies Bp × 0.7 < Ba. In other words, the threshold value Bth is defined as a value of the brightness of the asphalt-paved road surface that is approximately 30% of the brightness of the patterned light beam.
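A sketch of this calibration procedure is given below; 'capture_and_measure' is a hypothetical helper, not part of the disclosure, that captures a road surface image at the requested average road surface brightness and returns the measured averages (Bp, Ba).

```python
def find_brightness_threshold(capture_and_measure, step=10):
    """Raise the road surface brightness Ba in steps of 10 scale points and return
    the first value satisfying the stated condition Bp * 0.7 < Ba, used as the
    road surface brightness threshold value Bth (illustrative sketch only)."""
    bp, ba = capture_and_measure(None)           # first capture under the initial lighting
    while not (bp * 0.7 < ba):
        bp, ba = capture_and_measure(ba + step)  # hypothetical helper: capture with Ba raised by 10
    return ba                                    # road surface brightness threshold value Bth
```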
[064] When the brightness determination section 25 determines that the detected condition of the patterned light beam is equal to or greater than the predetermined threshold value, the patterned light beam extractor 21 superimposes a predetermined number of images in successive frames selected from the images obtained with the camera 12. Here, the following descriptions are given on the assumption that the images in successive frames stored in the memory between the past and the present are used as the images to be superimposed. However, the images to be superimposed may include images that the camera 12 will obtain between the present and the future. Thereafter, the patterned light beam extractor 21 extracts the position of the patterned light beam from the superimposed image thus generated. The orientation angle calculator 22 can calculate the orientation angle of the vehicle relative to the road surface using the position of the patterned light beam extracted from the superimposed image. Furthermore, the self-position calculator 26 can start adding the amount of change in orientation by setting the initial position and the initial orientation angle of the vehicle (the starting points), respectively, at the current position of the vehicle at that time and at the orientation angle of the vehicle calculated from the superimposed image.
[065] In this respect, the patterned light beam extractor 21 sets the predetermined number of images to be superimposed depending on the detected condition of the patterned light beam in the image obtained with the camera 12. The detected condition of the patterned light beam can be represented by a ratio (S/N ratio) between the brightness value of the patterned light beam and the brightness value of the ambient light, for example. The patterned light beam extractor 21 increases the number of images to be superimposed as the S/N ratio becomes smaller (as the external environment becomes brighter).
[066] As shown in Figure 7(a), since a relatively large number of images are required in a relatively bright external environment, the amount of vehicle movement required to obtain a group of images is relatively large. Furthermore, as shown in Figure 7(b), the relatively large number of images I1 are superimposed to generate a superimposed image I2. On the other hand, as shown in Figure 8(a), since a relatively small number of images are needed in a relatively dark external environment, the amount of vehicle movement required to obtain a group of images is relatively small. Furthermore, as shown in Figure 8(b), the relatively small number of images I1 are superimposed to generate a superimposed image I2.
[067] The number of images that need to be superimposed in order for the patterned light beam extractor 21 to extract the patterned light beam can be set using the following procedure, for example. To begin with, for each of the original average brightness value Ba of the pixels representing the asphalt-paved road surface and the average brightness values Ba obtained by sequentially adding 10 scale points (as when the road surface brightness threshold value Bth is obtained), Rap = Ba/Bp, that is, the ratio between the average brightness value Ba and the average brightness value Bp of the pixels representing the patterned light beam, is obtained in advance through experiments or the like, and is stored in the memory of the ECU 13. Subsequently, the actual control is performed by: rounding the average brightness value of the images obtained with the camera 12 to the nearest ten; obtaining the S/N-related ratio Rap corresponding to the rounded value; obtaining Sn using Equation (5) given below from the ratio Rap obtained in this way; rounding Sn thus obtained to the nearest whole number; and setting the number of images needed to extract the patterned light beam at that integer:

[068] In other words, when the average brightness value Ba of the asphalt-paved road surface is approximately 29% or less of the average brightness value Bp of the patterned light beam, the number of images required to extract the patterned light beam is set at 1. When the average brightness value Ba is 75% of the average brightness value Bp, the number of images required is set at 8. When the average brightness value Ba is 90% or more of the average brightness value Bp, the number of images required is set at 50. Incidentally, since the part representing the area over which the patterned light beam is projected is sufficiently small compared with the whole of each image, the average brightness value of the entire image may be used for the calculation. Otherwise, the number of images needed may be set at a number that makes the success rate of extracting the directed lights of the patterned light beam become 95% or greater when the images are actually superimposed to extract the directed lights, in an experiment for obtaining the ratios Rap using the average brightness value Ba and the values obtained by sequentially adding 10 scale points to the average brightness value Ba.
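Since Equation (5) is not reproduced in this text, the following stand-in simply interpolates between the example values stated above; it is an illustrative assumption, not the patent's formula.

```python
import numpy as np

def images_needed(rap):
    """Return the number of frames to superimpose for a given ratio Rap = Ba / Bp,
    interpolating the example values given above (Rap <= 0.29 -> 1 image,
    Rap = 0.75 -> 8 images, Rap >= 0.90 -> 50 images). Sketch only."""
    sn = np.interp(rap, [0.29, 0.75, 0.90], [1.0, 8.0, 50.0])
    return int(round(sn))

# example: images_needed(0.75) -> 8
```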
[069] However, as described below, in a case where the patterned light beam extractor 21 cannot generate (has difficulty generating) the superimposed image, the distance and orientation angle employed in the previous information processing cycle (hereinafter also referred to simply as the "previous values"), or the predetermined initial distance and predetermined initial orientation angle of the vehicle (also referred to simply as the "initial values"), are used as the starting points.
[070] First, there are cases where an excessively small S/N ratio (an excessive brightness) greatly increases the time required for the integration operation. For example, when the ratio Rap, obtained by referring to the nearest ten to which the average brightness value of the images obtained with the camera 12 is rounded, is 0.95 or greater, or when the average brightness value of the asphalt-paved road surface is 90% or more of the average brightness value of the patterned light beam, a lot of time is required to generate a superimposed image. In this case, it is determined that the assumption that the road surface changes only slightly while a superimposed image is being generated becomes unrealistic, or that it is theoretically difficult to extract the patterned light beam. In such cases, the previous values or the initial values are used as the starting points.
[071] Second, there are cases where the vehicle travels too far (the vehicle speed is too high). For example, in a case where the vehicle moves a distance greater than 0.2 [m] while the set number of images to be superimposed are being captured, the assumption that the road surface changes only slightly is determined to be unrealistic, and no superimposed image can be generated. In this respect, in the case where the average brightness value Ba is 75% of the average brightness value Bp, the number of images that need to be captured with a 1000 fps camera to generate a superimposed image is 8. For this reason, if the vehicle speed is equal to or greater than 90 km/h, which is obtained using Equation (6) given below, the previous values or the initial values are used as the starting points:

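As a worked check of the 90 km/h figure quoted above (Equation (6) itself is not reproduced here), the stated example values can be combined as follows:

```python
# Re-derivation of the stated example only; not the patent's Equation (6).
max_travel_m = 0.2        # allowed vehicle travel while the frames are being captured
frames_needed = 8         # number of images when Ba is 75% of Bp
frame_rate_fps = 1000     # camera frame rate assumed in the text

capture_time_s = frames_needed / frame_rate_fps          # 0.008 s
speed_limit_kmh = max_travel_m / capture_time_s * 3.6     # 25 m/s -> 90.0 km/h
print(speed_limit_kmh)    # 90.0
```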
[072] Third, there are cases where the change in the road surface (bumps or unevenness) is very large. For example, it is determined whether the road surface condition around the vehicle changes to an extent equal to or greater than a threshold value while the set number of images to be superimposed are being captured. If more than 5% of the images obtained in this way show that the road surface condition around the vehicle changes to an extent equal to or greater than the threshold value, it is determined that the assumption that the road surface changes only slightly becomes invalid. In this case, the previous values or the initial values are used as the starting points. Note that how to determine the change in the road surface will be described in detail in the second embodiment.
[073] Figures 9(a) to 9(d) respectively show a change in a reset flag, a change in the number of images to be superimposed, a change in the condition under which feature points are detected, and a change in the number of associated feature points, in the self-position calculating apparatus of the first embodiment. For example, in the information processing cycles at time t11 and time t12, once the number of associated feature points becomes equal to or less than 3, as shown in Figure 9(d), it is determined that the condition under which the feature points are detected becomes unsatisfactory, as shown in Figure 9(c). In response to this, the reset flag is set to "1", as shown in Figure 9(a).
[074] In the information processing cycle at time t11, as shown in Figure 9(b), the patterned light beam extractor 21 sets the number of images to be superimposed at 1 (that is, decides not to superimpose images) based on the average brightness value of the images obtained with the camera 12. Accordingly, the patterned light beam extractor 21 extracts the patterned light beam from the single image obtained in the current information processing cycle at time t11.
[075] Meanwhile, in the information processing cycle at time t12, as shown in Figure 9(b), the patterned light beam extractor 21 sets the number of images to be superimposed at 2 based on the average brightness value of the images obtained with the camera 12. Furthermore, the patterned light beam extractor 21 generates a superimposed image by superimposing two images (by adding up their brightness values), that is, an image obtained with the camera 12 at time t12 and an image obtained with the camera 12 in the previous information processing cycle. Thereby, the patterned light beam extractor 21 extracts the patterned light beam from the superimposed image. Information Processing Cycle
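For illustration only, the superposition by adding up brightness values mentioned above could be sketched as follows; returning the raw sum and scaling the spot-extraction threshold by the number of frames is an assumption about the implementation.

```python
import numpy as np

def superimpose_frames(frames):
    """Generate a superimposed image by summing the brightness values of the
    selected successive frames, as in the time-t12 example above (sketch only)."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    for frame in frames:
        acc += frame          # add the brightness values pixel by pixel
    # the spot-extraction threshold would be scaled by len(frames) accordingly
    return acc

# usage: superimposed = superimpose_frames([frame_previous, frame_t12])
```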
[076] Next, as an example of a self-position calculation method for estimating the amount of movement of the vehicle 10 from the image 38 obtained with the camera 12, the information processing cycle to be repeatedly performed by the ECU 13 will be described with reference to Figures 10 and 11. The information processing cycle shown in the flowchart of Figure 10 is started at the same time as the self-position calculating apparatus becomes activated after the ignition key of the vehicle 10 is turned on, and is repeatedly performed until the self-position calculating apparatus stops its operation.
[077] In step S01 in Figure 10, the patterned light beam controller 27 controls the light projector 11 to make the light projector 11 project the patterned light beam 32a onto the road surface 31. Using the flowchart in Figure 10, descriptions will be provided for the case where the patterned light beam 32a is continuously projected.
[078] Proceeding to step S11, the ECU 13 controls the camera 12 to obtain the image 38 by making the camera 12 capture the road surface 31 around the vehicle 10, including the area onto which the patterned light beam 32a is projected. The ECU 13 stores the data about the image obtained with the camera 12 in the memory. It should be noted that the ECU 13 is capable of automatically controlling the diaphragm of the camera 12. The ECU 13 may be configured to perform feedback control of the diaphragm of the camera 12 so that a brightness value of the next image becomes equal to an average value between the maximum and minimum values, according to an average of the brightness of the image 38 obtained in the previous information processing cycle. Otherwise, considering the brightness value of the area over which the patterned light beam 32a is projected, the ECU 13 may obtain the average brightness value of the previously obtained image 38 from an area outside the part from which the patterned light beam 32a is extracted.
[079] Proceeding to step S12, the brightness determination section 25 reads the image 38 obtained with the camera 12 from the memory, and determines whether the average brightness of the image is less than the road surface brightness threshold value Bth. If the brightness determination section 25 determines that the average brightness of the image is less than the road surface brightness threshold value Bth, the ECU proceeds to step S15.
[080] Proceeding to step S15, to begin with, the patterned light beam extractor 21 reads the image 38 obtained with the camera 12 from the memory, and extracts the position of the patterned light beam 32a from the image 38, as shown in Figure 4(c). The patterned light beam extractor 21 stores the coordinates (Uj, Vj) of the directed lights Sp in the image, which are calculated as the data about the position of the patterned light beam 32a, in the memory.
[081] On the other hand, if it is determined in step S12 that the average brightness of the image is equal to or greater than the road surface brightness threshold value Bth, the ECU proceeds to step S13. In step S13, the patterned light beam extractor 21 sets the number of images (the number of frames) needed to extract the patterned light beam, based on the average brightness of the image obtained with the camera 12.
[082] In step S14, the patterned light beam extractor 21 reads the image 38 obtained with the camera 12 from the memory, and superimposes the set number of images in successive frames (by adding up the brightness values) to generate a superimposed image. Furthermore, the patterned light beam extractor 21 extracts the position of the patterned light beam 32a from the superimposed image thus generated. The patterned light beam extractor 21 stores the coordinates (Uj, Vj) of the directed lights Sp in the image, which are calculated as the data about the position of the patterned light beam 32a, in the memory.
[083] In step S16, the orientation angle calculator 22 reads the data on the position of the patterned light beam 32a extracted in step S14 or S15 from the memory, calculates the distance and the orientation angle from the position of the patterned light beam 32a, and stores the distance and the orientation angle thus calculated in the memory.
[084] Proceeding to step S17, the ECU 13 detects feature points from the image 38, and extracts the feature points that can be associated between the previous and current information processing cycles. From the positions (Ui, Vi) in the image of the feature points thus associated, the ECU 13 calculates the amounts of change in the distance and in the orientation angle, as well as the amount of movement of the vehicle.
[085] Specifically, to begin with, the feature point detector 23 reads the image 38 obtained with the camera 12 from the memory, detects feature points on the road surface 31 from the image 38, and stores the positions (Ui, Vi) in the image of the feature points thus detected in the memory. The orientation change amount calculator 24 reads the positions (Ui, Vi) of the respective feature points in the image from the memory. From the distance and the orientation angle, as well as the positions (Ui, Vi) of the feature points in the image, the orientation change amount calculator 24 calculates the relative positions (Xi, Yi, Zi) of the feature points with respect to the camera 12. In doing so, the orientation change amount calculator 24 uses the distance and the orientation angle set in step S16 in the previous information processing cycle. The orientation change amount calculator 24 stores the relative positions (Xi, Yi, Zi) of the feature points with respect to the camera 12 in the memory.
[086] Subsequently, the orientation change amount calculator 24 reads the positions (Ui, Vi) of the feature points in the image, and the relative positions (Xi, Yi, Zi) of the feature points with respect to the camera 12 calculated in step S17 in the previous information processing cycle, from the memory. Using the relative positions (Xi, Yi, Zi) of the feature points associated between the previous and current information processing cycles, as well as the positions (Ui, Vi) in the image of the feature points thus associated, the orientation change amount calculator 24 calculates the amounts of change in the distance and in the orientation angle. Furthermore, using the relative positions (Xi, Yi, Zi) of the feature points calculated in the previous information processing cycle and the relative positions (Xi, Yi, Zi) of the feature points calculated in the current information processing cycle, the orientation change amount calculator 24 calculates the amount of movement of the vehicle. The "amounts of change in the distance and in the orientation angle" and the "amount of movement of the vehicle" thus calculated are used in the process in step S19.
[087] Proceeding to step S18, the ECU 13 sets the starting points for the integration operations, depending on the condition under which the feature points are detected and on the state of the calculation of the distance and the orientation angle using the patterned light beam. Detailed descriptions of this will be provided later with reference to Figure 11.
[088] Proceeding to step S19, the autoposition calculator 26 calculates the current position of the vehicle from the starting points for the integration operations defined in the process in step S18, and the amount of movement of the vehicle calculated in the process in step S17.
[089] The self-position calculating apparatus of the embodiment is thus capable of calculating the current position of the vehicle 10 by adding up the amounts of movement of the vehicle while repeatedly performing the series of information processing cycles described above.
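The per-cycle integration can be summarized by a minimal sketch; the tuple-based interface and names are assumptions made only for illustration.

```python
def run_cycles(start_position, start_angle, per_cycle_updates):
    """Add the per-cycle amount of movement and orientation change to the starting points."""
    x, y = start_position
    angle = start_angle
    for dx, dy, d_angle in per_cycle_updates:   # one entry per information processing cycle
        x, y, angle = x + dx, y + dy, angle + d_angle
    return (x, y), angle
```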
[090] Referring to the flowchart in Figure 11, detailed descriptions will be provided for the procedure of step S18 in Figure 10. In step S100, the ECU 13 determines whether or not the current information processing cycle is a cycle performed for the first time. If the current information processing cycle is performed for the first time, or if there is no data from the previous information processing cycle, the ECU 13 proceeds to step S104. If the current information processing cycle is not a cycle performed for the first time, the ECU 13 proceeds to step S101.
[091] In step S101, the detection condition determination section 28 determines whether or not the condition under which the feature point detector 23 detects the feature points Te is too unsatisfactory to satisfy the first criterion. If the detection condition determination section 28 determines that the condition is unsatisfactory (YES in step S101), the ECU 13 proceeds to step S102. If the detection condition determination section 28 determines that the condition is not unsatisfactory (NO in step S101), the ECU 13 proceeds to step S106. In step S106, the ECU 13 keeps the starting points for the integration operations as they are.
[092] In step S102, the ECU 13 determines whether or not a superimposed image has already been generated. Examples of the case where no superimposed image has yet been generated include a case where the superimposed image is still in the process of being generated because obtaining the predetermined number of images, which include images to be captured in the future, takes time, and a case where it is theoretically impossible or difficult to generate a superimposed image. If the ECU 13 determines that no superimposed image has been generated yet (NO in step S102), the ECU 13 proceeds to step S103. If the ECU 13 determines that a superimposed image has already been generated (YES in step S102), the ECU 13 proceeds to step S104.
[093] In step S103, the calculation state determination section 29 determines whether or not the state of the calculation of the distance and the orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy the second criterion. For example, the calculation state determination section 29 determines whether or not the orientation angle calculator 22 succeeds in calculating the distance and the orientation angle. If the calculation state determination section 29 determines that the orientation angle calculator 22 succeeds in the calculation (YES in step S103), the ECU 13 proceeds to step S104. If the calculation state determination section 29 determines that the orientation angle calculator 22 does not succeed (NO in step S103), the ECU 13 proceeds to step S105.
[094] In step S104, the ECU 13 sets the starting point for the integration operation at the current position of the vehicle, and additionally sets the starting points for the integration operations at the distance and the orientation angle calculated in step S16 in the same information processing cycle. Using the distance and the orientation angle as the starting points, the ECU 13 performs the integration operations anew. Furthermore, using the current position of the vehicle as the starting point, the ECU 13 performs the integration operation on the amount of movement of the vehicle anew.
[095] In step S105, the ECU 13 sets the starting point for the integration operation at the current position of the vehicle, and additionally sets the starting points for the integration operations at the distance and the orientation angle used in the previous information processing cycle. Using the distance and the orientation angle as the starting points, the ECU 13 performs the integration operations anew. Furthermore, using the current position of the vehicle as the starting point, the ECU 13 performs the integration operation on the amount of movement of the vehicle anew. Thereafter, the ECU 13 proceeds to step S19 in Figure 10. Effects of the First Embodiment
[096] As described above, according to the first embodiment, the brightness determination section 25 performs a determination on the detected condition of the patterned light beam. If the detected condition of the patterned light beam is equal to or greater than the predetermined threshold value, the patterned light beam extractor 21 generates the superimposed image by superimposing the images in the successive frames up to the current frame, and extracts the patterned light beam from the superimposed image. For these reasons, even when the outside environment is bright, the patterned light beam projected onto the road surface can be accurately detected. Consequently, the self-position of the vehicle can be accurately calculated.
[097] Furthermore, the number of images required for the patterned light beam extractor 21 to generate the superimposed image is set depending on the detected condition of the patterned light beam, such as the average brightness of the image captured with the camera 12. For this reason, the brightness value of the patterned light beam to be detected can be controlled depending on how bright the external environment is. Consequently, the patterned light beam can be accurately detected.
[098] Furthermore, as in step S102 in Figure 11, the distance and the orientation angle employed in the previous information processing cycle, or the predetermined initial distance and predetermined initial orientation angle, are used as the starting points while the superimposed image is being generated. For this reason, errors in the self-position calculation can be suppressed.
[099] It should be noted that in step S105 in Figure 11, the ECU 13 may set the starting points for the integration operations at the predetermined initial distance and predetermined initial orientation angle instead of the distance and the orientation angle used in the previous information processing cycle. In detail, if the detection condition determination section 28 determines that the condition under which the feature points are detected is too unsatisfactory to satisfy the first criterion, and, simultaneously, the calculation state determination section 29 determines that the state of the calculation of the distance and the orientation angle by the orientation angle calculator 22 is too unsatisfactory to satisfy the second criterion, the autoposition calculator 26 may set the starting points for the integration operations at the predetermined initial distance and the predetermined initial orientation angle, which take at least the occupants and the load of the vehicle into account. For example, the autoposition calculator 26 may use the distance and the orientation angle calculated in step S16 in the information processing cycle performed immediately after the self-position calculating apparatus is activated. For that reason, the ECU 13 can update the distance and the orientation angle, and calculate the amount of movement, by setting the starting points at a distance and an orientation angle that are not affected by the roll or pitch of the vehicle 10 caused by turning, accelerating or decelerating the vehicle 10. Second Embodiment: Hardware Configuration
[0100] Descriptions will be provided for a second embodiment of the present invention, namely a case where the self-position of a vehicle is calculated based on a change in the road surface condition around the vehicle. As shown in Figure 12, the self-position calculating apparatus of the second embodiment is different from that of the first embodiment in that the self-position calculating apparatus of the second embodiment includes a road surface condition determination section 30 instead of the detection condition determination section 28 and the calculation state determination section 29. The rest of the configuration of the self-position calculating apparatus of the second embodiment is the same as that of the first embodiment. For this reason, the description of the rest of the configuration will be omitted.
[0101] The road surface condition determination section 30 detects changes in the road surface condition around the vehicle, and determines whether or not the road surface condition changes by as much as or more than a threshold value. If the road surface condition determination section 30 determines that the road surface condition changes by as much as or more than the threshold value, the autoposition calculator 26 retains the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle. Thereby, the orientation angle calculator 22 stops calculating the distance and the orientation angle of the vehicle 10 relative to the road surface. Meanwhile, the autoposition calculator 26 calculates the current position of the vehicle 10 at present, as well as the current distance and orientation angle of the vehicle 10 at present relative to the road surface, by adding the amount of change in orientation to the current position of the vehicle 10, as well as to the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle.
[0102] In this regard, descriptions will be provided for how to determine changes in the road surface condition. In this embodiment, the 35 (5 × 7) spotlights of the patterned light beam 32a are projected onto the road surface. Taking this into account, for example, if only 80% or less of the 35 spotlights, in other words only 28 or fewer spotlights, can be detected in the image captured with the camera 12, the road surface condition determination section 30 determines that the road surface has become considerably bumpy or uneven, and that the road surface condition changes by as much as or more than the threshold value.
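The 80% criterion above amounts to a simple count check, sketched below under assumed names.

```python
def road_surface_changed(num_detected, total=35, ratio=0.8):
    """True when fewer spotlights than 80% of the 5 x 7 pattern are detected."""
    threshold = int(total * ratio)   # 28 spotlights for the 35-spotlight pattern
    return num_detected < threshold
```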
[0103] Instead, the road surface condition determination section 30 may estimate changes in the road surface condition from the amount of change in the road surface height. The amount of change in the road surface height can be detected from oscillations in the value detected by a stroke sensor attached to the suspension of each wheel of the vehicle. For example, if the oscillations in the value detected by the stroke sensor reach a frequency equal to or higher than 1 Hz, the road surface condition determination section 30 estimates that the road surface has become considerably bumpy or uneven, and determines that the road surface condition changes by as much as or more than the threshold value. Alternatively, the road surface condition determination section 30 may be configured to calculate a velocity in the vertical direction by integrating the values detected by an acceleration sensor that measures acceleration in the vertical direction, and thereby determine that the road surface has become considerably bumpy or uneven and that the road surface condition changes by as much as or more than the threshold value when the direction of this velocity changes at a frequency equal to or higher than 1 Hz.
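The 1 Hz criterion can be checked, for example, by estimating the oscillation frequency of the sensor signal from its sign changes; the sketch below is one possible way to do this and is not taken from the patent.

```python
import numpy as np

def oscillation_frequency(samples, sample_rate_hz):
    """Estimate the oscillation frequency [Hz] from sign changes about the mean."""
    centered = np.asarray(samples, dtype=float) - np.mean(samples)
    signs = np.signbit(centered)
    crossings = np.count_nonzero(signs[1:] != signs[:-1])
    duration = len(samples) / sample_rate_hz
    return (crossings / 2.0) / duration          # two zero crossings per oscillation cycle

def surface_changed_by_oscillation(samples, sample_rate_hz, limit_hz=1.0):
    return oscillation_frequency(samples, sample_rate_hz) >= limit_hz
```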
[0104] Otherwise, the amount of change in the road surface height can be estimated from the position of the patterned light beam 32a in the image captured with the camera 12. In the embodiment, the patterned light beam 32a as shown in Figure 13 is projected onto the road surface 31. In this case, a line 71 joining the spotlights of the patterned light beam 32a in the X direction, and a line 73 joining the spotlights of the patterned light beam 32a in the Y direction, are drawn. Subsequently, if, as indicated by a point 75, the slope of either of these lines changes by 15 degrees or more partway along the line, the road surface condition determination section 30 estimates that the road surface has become considerably bumpy or uneven, and determines that the road surface condition changes by as much as or more than the threshold value. Instead, as shown in Figure 13, the road surface condition determination section 30 may determine that the road surface condition changes by as much as or more than the threshold value if the difference between the distances d1, d2 between adjacent spotlights changes by as much as or more than 50%.
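The two geometric checks above (a bend of 15 degrees or more in a line of spotlights, or a 50% change between adjacent spacings) can be sketched as follows; the polyline representation and names are assumptions.

```python
import math

def line_bend_exceeds(points, limit_deg=15.0):
    """True if the polyline through the spotlights bends by limit_deg or more anywhere."""
    angles = [math.degrees(math.atan2(y2 - y1, x2 - x1))
              for (x1, y1), (x2, y2) in zip(points, points[1:])]
    for a1, a2 in zip(angles, angles[1:]):
        diff = abs(a2 - a1) % 360.0
        if min(diff, 360.0 - diff) >= limit_deg:
            return True
    return False

def spacing_change_exceeds(d1, d2, limit=0.5):
    """True if adjacent spotlight spacings d1 and d2 differ by 50% or more."""
    return abs(d1 - d2) / max(d1, d2) >= limit
```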
[0105] When the road surface condition determination section 30 thus determines that the road surface condition changes by as much as or more than the threshold value, the autoposition calculator 26 retains the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle. Thereby, the orientation angle calculator 22 stops calculating the distance and the orientation angle of the vehicle 10 relative to the road surface. Meanwhile, the autoposition calculator 26 calculates the current position of the vehicle 10 at present, as well as the current distance and orientation angle of the vehicle 10 at present relative to the road surface, by adding the amount of change in orientation to the current position of the vehicle 10, as well as to the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle.
[0106] For example, as shown in Figure 14(e), the road surface condition determination section 30 monitors the number of detected spotlights, and sets the threshold value at 28, which corresponds to 80% of the 35 spotlights. In this case, while 28 or more spotlights can be detected, the road surface condition determination section 30 sets an orientation angle calculation flag to "1". Accordingly, the orientation angle calculator 22 calculates the distance and the orientation angle of the vehicle 10 relative to the road surface. Meanwhile, the autoposition calculator 26 calculates the current self-position of the vehicle by calculating the current distance and orientation angle using the distance and the orientation angle of the vehicle calculated by the orientation angle calculator 22, and by adding (continuing the integration operation on) the amount of movement of the vehicle to the current position of the vehicle 10 that is calculated in the previous information processing cycle.
[0107] However, at time t21 when the number of detected spotlights becomes less than the threshold value, the autoposition calculator 26 switches the orientation angle calculation flag to "0". Thereby, the starting points are maintained at the current position of the vehicle 10, as well as at the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle. The orientation angle calculator 22 stops calculating the distance and the orientation angle of the vehicle 10. Accordingly, the autoposition calculator 26 calculates the current position of the vehicle 10 at present, as well as the current distance and orientation angle of the vehicle 10 at present relative to the road surface, by adding the amount of change in orientation to the current position of the vehicle 10, as well as to the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle.
[0108] Further, at time t22 when the number of detected spotlights becomes equal to or greater than the threshold value again, the orientation angle calculation flag is set to "1". The orientation angle calculator 22 resumes calculating the distance and the orientation angle of the vehicle 10. Accordingly, the autoposition calculator 26 calculates the current distance and orientation angle of the vehicle 10 using the distance and the orientation angle of the vehicle 10 calculated by the orientation angle calculator 22. As described above, when the road surface condition changes greatly, the self-position calculating apparatus of the embodiment uses the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle. For that reason, even when the road surface condition changes greatly, the self-position calculating apparatus of the embodiment is capable of calculating the self-position of the vehicle 10 accurately and stably.
[0109]As in the first embodiment, the patterned light beam extractor 21 generates a superimposed image by superimposing a predetermined number of images, if a detected condition of the patterned light beam is equal to or greater than the predetermined threshold value. The patterned light beam extractor 21 extracts the patterned light beam from the generated superimposed image.
[0110] Also, in a case where a large change in the road surface prevents the generation of a superimposed image, the previous values or the initial values are used as the starting points. For example, while the set number of images to be superimposed is being captured, the road surface condition determination section 30 determines whether or not the road surface around the vehicle changes to an extent equal to or greater than the threshold value. When determining that the road surface changes to that extent in 5% or more of the set number of images, the road surface condition determination section 30 determines that the assumption that the road surface changes only slightly no longer holds. In this case, the previous values or the initial values are used as the starting points.
[0111] Figures 14(a) to 14(e) show, in the self-position calculating apparatus of the second embodiment, a change in a reset flag, the reset timings at predetermined intervals, a change in the number of images to be superimposed, a change in the road surface condition between a good condition and an unsatisfactory condition, and changes in the sizes of the bumps and unevenness of the road surface. As shown in Figure 14(b), it is determined at predetermined intervals whether or not the superimposed image should be reset. In this regard, each predetermined interval is a period of 10 frames. Alternatively, each predetermined interval may be a period of 10 seconds.
[0112] Between times t21 and t22, as well as between times t24 and t25, the bumps and unevenness of the road surface exceed the threshold value, as shown in Figure 14(e), and it is determined that the road surface condition is unsatisfactory, as shown in Figure 14(d). Thus, even though reset timings occur at the predetermined intervals, as shown in Figure 14(b), the reset flag is left unchanged at "0", as shown in Figure 14(a), and no superimposed image is generated.
[0113] However, at time t23, one of the reset timings at the predetermined intervals occurs, as shown in Figure 14(b), and the road surface condition is determined to be good, as shown in Figure 14(d). Thus, the reset flag is set to "1", as shown in Figure 14(a). As shown in Figure 14(c), the patterned light beam extractor 21 sets the number of images to be superimposed to 3, based on the average brightness value of the image obtained with the camera 12. In addition, the patterned light beam extractor 21 generates a superimposed image by superimposing the two images obtained with the camera 12 in the immediately preceding and second preceding frames onto the image obtained with the camera 12 at time t23. Thereafter, the patterned light beam extractor 21 extracts the patterned light beam from the superimposed image generated in this way. Information Processing Cycle
[0114] Next, using Figures 15 and 16, descriptions will be provided for a self-position calculation method of the second embodiment of the present invention. The procedures for steps S20 to S27 and S29 shown in Figure 15 are the same as those for steps S10 to S17 and S19 shown in Figure 10. For this reason, descriptions of these procedures will be omitted.
[0115]In step S28, ECU 13 sets the starting points for integration operations to calculate the vehicle's autoposition depending on a change in the road surface condition around the vehicle. Referring to a flowchart in Figure 16, descriptions will be provided for a detailed procedure for step S28 in Figure 15.
[0116] As shown in Figure 16, in step S201, the road surface condition determination section 30 determines whether or not a predetermined interval has elapsed. The predetermined interval can be set, for example, to a length of time that allows the camera to obtain the number of images necessary for the patterned light beam extractor 21 to generate a superimposed image. The road surface condition determination section 30 monitors whether an interval count pulse occurs. Once the interval count pulse occurs, the road surface condition determination section 30 determines that a predetermined interval has elapsed, as described with reference to Figure 14. Thereby, the ECU proceeds to step S202. On the other hand, if no interval count pulse occurs, the road surface condition determination section 30 determines that no predetermined interval has yet elapsed. Thus, the ECU proceeds to step S205.
[0117] In step S202, the road surface condition determination section 30 detects a change in the road surface condition around the vehicle. Specifically, the road surface condition determination section 30 detects how many of the spotlights of the patterned light beam 32a are detected, or detects oscillations in the values detected by the stroke sensor attached to each wheel. Otherwise, the road surface condition determination section 30 may be configured to calculate a velocity in the vertical direction by integrating the values detected by an acceleration sensor that measures the vehicle's acceleration in the vertical direction, or to detect the position of the patterned light beam 32a.
[0118] Subsequently, in step S203, the road surface condition determination section 30 determines whether or not the road surface condition changes by as much as or more than the threshold value. For example, in a case where the road surface condition determination section 30 is configured to detect the number of spotlights of the patterned light beam 32a, if only 28 or fewer of the 35 spotlights can be detected in the image obtained with the camera, the road surface condition determination section 30 determines that the road surface has become considerably bumpy or uneven, and that the road surface condition changes by as much as or more than the threshold value.
[0119] Otherwise, in the case of using the stroke sensor, the road surface condition determination section 30 determines that the road surface condition changes by as much as or more than the threshold value if the oscillations in the detected value reach a frequency equal to or higher than 1 Hz. Also, in a case where the road surface condition determination section 30 is configured to use the acceleration sensor, the road surface condition determination section 30 calculates the velocity in the vertical direction by integrating the values detected by the acceleration sensor. If the direction of this velocity changes at a frequency equal to or higher than 1 Hz, the road surface condition determination section 30 determines that the road surface condition changes by as much as or more than the threshold value.
[0120] Further, in a case where the road surface condition determination section 30 is configured to use the position of the patterned light beam 32a, if the slope of one of the lines joining the spotlights changes by 15 degrees or more partway along the line, the road surface condition determination section 30 determines that the road surface condition changes by as much as or more than the threshold value. Otherwise, if the difference between the distances between adjacent spotlights changes by as much as or more than 50%, the road surface condition determination section 30 may determine that the road surface condition changes by as much as or more than the threshold value.
[0121] As described above, the road surface condition determination section 30 determines whether or not the road surface condition around the vehicle changes by as much as or more than the threshold value. If the road surface condition determination section 30 determines that the road surface condition around the vehicle changes by as much as or more than the threshold value (YES in step S203), the procedure proceeds to step S204. On the other hand, if the road surface condition determination section 30 determines that the road surface condition around the vehicle does not change by as much as or more than the threshold value (NO in step S203), the procedure proceeds to step S205.
[0122] In step S204, the autoposition calculator 26 maintains the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, at the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle. In other words, the autoposition calculator 26 sets the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle, as the starting points of the integration operation.
[0123] Therefore, the orientation angle calculator 22 stops calculating the distance and the orientation angle of the vehicle 10 relative to the road surface. Meanwhile, the autoposition calculator 26 calculates the current position of the vehicle 10 at present, as well as the current distance and orientation angle of the vehicle 10 at present relative to the road surface, by adding the amount of change in orientation to the current position of the vehicle 10, as well as to the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the previous information processing cycle.
[0124] On the other hand, in step S205, the autoposition calculator 26 sets the starting points of the integration operation at the current position of the vehicle 10, as well as at the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in step S15 in the current information processing cycle. Accordingly, the autoposition calculator 26 calculates the current position of the vehicle 10, as well as the current distance and orientation angle of the vehicle 10 relative to the road surface, by adding the amount of change in orientation to the current position of the vehicle 10, as well as to the current distance and orientation angle of the vehicle 10 relative to the road surface, which are calculated in the current information processing cycle.
[0125] Once, as described above, the autoposition calculator 26 sets the starting points of the integration operation for calculating the current position of the vehicle 10 at present, as well as the current distance and orientation angle of the vehicle 10 at present relative to the road surface, the process in step S28 is completed, and the procedure proceeds to step S29 in Figure 15. Effects of the Second Embodiment
[0126] As described above, according to the second embodiment, the brightness determination section 25 performs a determination on the detected condition of the patterned light beam. If the detected condition of the patterned light beam is equal to or greater than the predetermined threshold value, the patterned light beam extractor 21 generates the superimposed image by superimposing the images in the respective frames. For this reason, even when the outside environment is bright, the patterned light beam projected onto the road surface can be accurately detected. Consequently, the self-position of the vehicle can be accurately calculated.
[0127] Furthermore, the number of images required for the patterned light beam extractor 21 to generate the superimposed image is set depending on the detected condition of the patterned light beam, such as the average brightness of the image captured with the camera 12. For this reason, the brightness value of the patterned light beam to be detected can be controlled depending on how bright the external environment is. Consequently, the patterned light beam can be accurately detected. Third Embodiment: Hardware Configuration
[0128] Descriptions will be provided for a third embodiment of the present invention, namely a case where, among images obtained by synchronous detection, images corresponding to a predetermined number of cycles are superimposed to generate a superimposed image. As shown in Figure 17, the self-position calculating apparatus of the third embodiment is different from that of the first embodiment in that the self-position calculating apparatus of the third embodiment includes neither the detection condition determination section 28 nor the calculation state determination section 29. The rest of the configuration of the self-position calculating apparatus of the third embodiment is the same as that of the first embodiment. For this reason, the description of the rest of the configuration will be omitted.
[0129] The patterned light beam controller 27 starts projecting the patterned light beam 32a whose brightness changes periodically. Thereafter, the patterned light beam controller 27 continues to project the patterned light beam 32a until the self-position calculating apparatus stops its operation. Otherwise, the patterned light beam controller 27 may be configured to project the patterned light beam 32a as needed. In the embodiment, the electrical power supplied to the light projector 11 is controlled so that, as an example of the change in brightness, the brightness of the projected light pattern changes in the same manner as a sine wave of a predetermined frequency.
[0130] The patterned light beam extractor 21 reads the images obtained with the camera 12 from the memory, performs a synchronous detection process on the images at the above-mentioned predetermined frequency, and extracts the patterned light beam 32a from the images thus processed.
[0131] In the following, descriptions will be provided for the synchronous detection process. A measured signal included in the images captured with the camera 12 is expressed as sin(w0 + α). This measured signal contains signals representing sunlight and artificial light, both having various frequency components, in addition to the signal representing the projected light pattern. Taking this into account, the measured signal sin(w0 + α) is multiplied by a reference signal sin(wr + β) whose frequency is the modulation frequency wr. The multiplication then produces:
sin(w0 + α) × sin(wr + β) = 1/2 × {cos[(w0 - wr) + (α - β)] - cos[(w0 + wr) + (α + β)]}
[0132] When the measured signal obtained in this way is passed through a low-pass filter, the signals whose frequencies w0 are not equal to wr, that is, the sunlight and the artificial light, which are not the projected light pattern and whose frequencies differ from wr, are removed. In contrast, the signal whose frequency w0 is equal to wr, that is, the projected light pattern whose frequency is equal to wr, can be extracted because it is represented by the term cos(α - β)/2, which does not depend on the frequency. The use of a synchronous detection process such as this makes it possible to obtain an image that represents only the extracted projected light pattern.
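A per-pixel version of this synchronous detection can be sketched as follows; the array shapes and function name are assumptions, and the averaging over whole reference cycles plays the role of the low-pass filter.

```python
import numpy as np

def synchronous_detection(frames, frame_rate_hz, wr_hz, beta=0.0):
    """frames: (num_frames, H, W) brightness values covering whole reference cycles."""
    frames = np.asarray(frames, dtype=np.float32)
    t = np.arange(frames.shape[0]) / frame_rate_hz
    reference = np.sin(2.0 * np.pi * wr_hz * t + beta)
    # Multiply each frame by the reference value and average over time; components
    # at frequencies other than wr average out toward zero.
    return np.tensordot(reference, frames, axes=(0, 0)) / frames.shape[0]
```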
[0133] In other words, in the embodiment, the patterned light beam controller 27 modulates the brightness of the patterned light beam 32a with the predetermined modulation frequency wr, which is defined in advance. In this way, the projected light pattern whose brightness is modulated with the frequency wr is projected onto the road surface. Meanwhile, the orientation angle calculator 22 can extract only the projected light pattern by multiplying an image captured with the camera 12 (the measured signal) by the reference signal of the modulation frequency wr.
[0134] Figure 18(a) is a characteristic diagram showing a change in the brightness of the patterned light beam 32a projected by the light projector 11. Figure 18(b) is a characteristic diagram showing a change in a feature point detection flag. For example, the brightness is controlled so that it changes in the same manner as a sine wave, as shown in Figure 18(a).
[0135] In this regard, descriptions will be provided for how to set the brightness of the patterned light beam 32a. In this embodiment, the maximum brightness of the patterned light beam 32a (the upper peaks of the brightness B1 shown in Figure 18(a)) is set so that the patterned light beam 32a can be detected even under the clear sky around the summer solstice (in June), when the amount of sunlight is the highest during the year. Furthermore, the minimum brightness of the patterned light beam (the lower peaks of the brightness B1 shown in Figure 18(a)) is set so that, with a probability of 99% or more, the projected light pattern is not erroneously detected as feature points on the road surface at night, when the influence of the patterned light beam is greatest during the day.
[0136] The frequency for modulating the brightness of the patterned light beam is set to 200 [Hz], and the frame rate of the camera 12 (the number of images captured per second) is set to 2400 [fps], for example. The reason for these settings is that the embodiment assumes the maximum vehicle speed to be 72 km/h (approximately equal to 20 m/s) and therefore keeps the amount of vehicle movement per modulation cycle at 0.1 m or less. This is because, judging from the concept of the embodiment, it is desirable that the area onto which the patterned light beam 32a is projected and the area from which the feature points are detected be as close to each other as possible, or coincide with each other. Furthermore, judging from the concept of synchronous detection, a larger amount of movement per cycle means that the road surface changes during the movement and that the sunlight and artificial light, which are different from the projected light pattern, are likely to change as well; in that case the precondition for synchronous detection becomes invalid. The embodiment avoids this by reducing the amount of movement per cycle. For this reason, the use of a higher-speed camera and a further reduction of the cycle make it possible to expect a further improvement in performance.
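The numbers quoted above can be checked with a small calculation; the values are those given in the paragraph.

```python
max_speed_mps = 72.0 / 3.6                               # 72 km/h is 20 m/s
modulation_hz = 200.0                                    # brightness modulation frequency
frame_rate_fps = 2400.0                                  # camera frame rate
movement_per_cycle_m = max_speed_mps / modulation_hz     # 0.1 m per modulation cycle
frames_per_cycle = frame_rate_fps / modulation_hz        # 12 images per cycle
print(movement_per_cycle_m, frames_per_cycle)
```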
[0137] The feature point detector 23 determines whether or not the brightness of the patterned light beam 32a projected by the light projector 11 is greater than the preset threshold brightness value Bth. If the brightness is greater than the threshold brightness value Bth, the patterned light beam extractor 21 extracts the patterned light beam 32a through the synchronous detection process described above. For example, as shown in Figures 19(a) and 19(c), the patterned light beam extractor 21 extracts the patterned light beam 32a through the synchronous detection process from an image captured at time t1 or time t3 (when the brightness B1 is greater than the threshold brightness value Bth) shown in Figure 18(a).
[0138] On the other hand, if the brightness is equal to or less than the threshold brightness value Bth, the feature point detector 23 detects feature points present on the road surface 31. Specifically, in the case where the brightness B1 of the patterned light beam 32a changes in the same manner as the sine wave, as shown in Figure 18(a), the feature point detector 23 detects feature points during the time periods when the brightness B1 is not greater than the threshold brightness value Bth. In other words, as shown in Figure 18(b), the feature point detector 23 keeps the feature point detection flag at "1" during the time periods when the brightness B1 is not greater than the threshold brightness value Bth, and detects feature points during these time periods. For example, as shown in Figure 19(b), the feature point detector 23 detects the feature points from an image captured at time t2 (when the brightness B1 is equal to or less than the threshold brightness value Bth) shown in Figure 18(b). Subsequently, based on the positions in the image of the feature points detected by the feature point detector 23, the orientation change amount calculator 24 calculates the amount of change in the position of each feature point relative to the camera 12. In this regard, the area from which the feature points are detected fully coincides with, or partially overlaps, the area onto which the patterned light beam 32a is projected.
[0139] In the embodiment, when the brightness determination section 25 determines that the detected condition of the patterned light beam is equal to or greater than the predetermined threshold value, the patterned light beam extractor 21 generates a superimposed image by performing the synchronous detection on the images captured with the camera 12 with the predetermined modulation frequency to obtain synchronized images corresponding to the predetermined number of cycles, and by superimposing the synchronized images (summing their brightness values). An image that represents only the extracted projected light pattern can be obtained from the images captured during one reference signal cycle. In a case where the process is carried out over two or more cycles, the images are superimposed by adding the brightness values of the pixels in the images extracted in the respective cycles. Consequently, the extraction of the spotlights from the superimposed image can be achieved by applying a binarization process to a normalized result that is obtained by dividing the summed pixel brightness values by the number of superimposed images.
[0140] Descriptions will be given below for an example of how to set the predetermined number of cycles for the superimposition. To begin with, Fn is obtained using Equation (7) given below from the ratio Rap, namely the ratio of the average brightness value of the image captured with the camera 12 to the average brightness of the patterned light beam, rounded to the nearest decimal. The predetermined number of cycles is set to the nearest integer value to which the Fn thus obtained is rounded. Specifically, when the average brightness of the asphalt-paved road surface is less than 50% of the average brightness of the patterned light beam, the predetermined number of cycles is set to 1. When it is 75%, the predetermined number of cycles is set to 2. When it is 90% or greater, the predetermined number of cycles is set to 5:
Fn = 0.5 / (1 - Rap) (7)
[0141] It should be noted that the predetermined number of cycles may instead be set to a number that makes the success rate of extracting the spotlights of the patterned light beam become 95% or greater, by actually performing the synchronous detection and extraction in an experiment for Rap ratios obtained using the average brightness value Ba and values obtained by sequentially adding increments of 10 scale points to the average brightness value Ba.
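Under the interpretation of Equation (7) given above (Fn = 0.5/(1 - Rap), an assumption consistent with the example values), the choice of the number of cycles can be sketched as follows:

```python
def cycles_to_superimpose(road_brightness, pattern_brightness):
    """Number of synchronous-detection cycles to superimpose, from the brightness ratio Rap."""
    rap = round(road_brightness / pattern_brightness, 1)   # ratio rounded to one decimal
    if rap >= 0.9:                                          # "90% or greater" case in the text
        return 5
    fn = 0.5 / (1.0 - rap)                                  # assumed form of Equation (7)
    return max(1, round(fn))                                # below 50% this gives 1, at 75% it gives 2
```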
[0142] In this respect, if no superimposed image can be generated because the vehicle speed is too high, the previous values or the initial values are used. For example, when the average brightness of the asphalt-paved road surface is 75% of the average brightness of the patterned light beam, two cycles are required for the synchronous detection. Furthermore, when the number of images required per cycle in the synchronous detection is 4, and the vehicle speed, which is obtained using Equation (8) given below, is equal to or greater than 90 km/h, the previous values or the initial values are used as the starting points:

[0143] Similarly, if no superimposed image can be generated because the change in the road surface is large, the previous values or the initial values are used. It is determined whether or not the road surface condition around the vehicle changes to an extent equal to or greater than the threshold value during the set number of cycles for the synchronous detection. If 5% or more of the images captured during the set number of cycles for the synchronous detection show that the road surface condition around the vehicle changes to an extent equal to or greater than the threshold value, it is determined that the assumption that the road surface changes only slightly no longer holds. In this case, the previous values or the initial values are used as the starting points. In the case where the synchronous detection is used, the determination of the road surface condition can be carried out using the stroke sensor.
[0144] Figures 20(a) to 20(d) respectively show, in the self-position calculating apparatus of the third embodiment, a change in a reset flag, the timing of the end of each cycle, a change in the number of cycles to be superimposed, and a change in the light projection power. Each of the times t31, t32, t33 is the timing of the end of a reference signal projection cycle, as shown in Figure 20(b). At times t31, t32, t33, the cyclically varying light projection power reaches its minimum, as shown in Figure 20(d), and the reset flag is set to "1", as shown in Figure 20(a).
[0145] At times t31 and t33, the number of cycles to be superimposed is set to 1, as shown in Figure 20(c). Thus, the patterned light beam extractor 21 generates a superimposed image from the images corresponding to the one cycle immediately before time t31, and a superimposed image from the images corresponding to the one cycle immediately before time t33.
[0146] On the other hand, at time t32, the number of cycles to be superimposed is set to 2, as shown in Figure 20(c). Accordingly, the patterned light beam extractor 21 generates the superimposed image by superimposing the images corresponding to the two cycles T1, T2 immediately before time t32. Information Processing Cycle
[0147] Next, as an example of a self-position calculation method for estimating the amount of movement of the vehicle 10 from the image 38 (see Figure 5) obtained with the camera 12, the information processing cycle repeatedly performed by the ECU 13 will be described with reference to the flowchart shown in Figure 21.
[0148] The information processing cycle shown in Figure 21 is started at the same time as the self-position calculating apparatus 100 is activated after the ignition key of the vehicle 10 is turned on, and is repeatedly carried out until the self-position calculating apparatus 100 stops its operation.
[0149] First, in step S31 in Figure 21, the patterned light beam controller 27 controls the light projector 11 to cause the light projector 11 to project the patterned light beam onto the road surface 31 around the vehicle. In this case, the patterned light beam controller 27 controls the light projection power so that, as shown in Figure 18(a), the brightness B1 of the patterned light beam changes in the same manner as a sine wave of a predetermined frequency. For example, the frequency of the sine wave is set to 200 [Hz]. Thus, the patterned light beam whose brightness B1 changes with time in the same manner as the sine wave is projected onto the road surface 31.
[0150]In step S32, the camera 12 captures an image of the road surface 31 that includes an area over which the patterned light beam is projected.
[0151]In step S33, ECU 13 determines whether a reference signal projection cycle for synchronous detection has completed or not. If ECU 13 determines that a reference signal projection cycle for synchronous detection has completed, ECU 13 proceeds to step S35. If ECU 13 determines that a reference signal projection cycle for synchronous detection has not yet been completed, ECU 13 proceeds to step S34.
[0152] In step S34, the feature point detector 23 detects feature points (e.g., uneven portions present in the asphalt) from the image 38, extracts the feature points that can be associated between the previous and current information processing cycles, and updates the distance and the orientation angle from the positions (Ui, Vi) in the image of the feature points thus extracted.
[0153] Specifically, when the feature point detection flag is set to "1", the feature point detector 23 reads the image 38 captured with the camera 12 from the memory, detects feature points on the road surface 31 from the image 38, and stores the positions (Ui, Vi) in the image of the respective feature points in the memory. The orientation change amount calculator 24 reads the positions (Ui, Vi) in the image of the respective feature points from the memory, and calculates the relative positions (Xi, Yi, Zi) of the feature points with respect to the camera 12 from the distance and the orientation angle, as well as from the positions (Ui, Vi) in the image of the feature points. The orientation change amount calculator 24 stores the relative positions (Xi, Yi, Zi) of the feature points with respect to the camera 12 in the memory.
[0154] Subsequently, the orientation change amount calculator 24 reads the positions (Ui, Vi) in the image of the feature points and the relative positions (Xi, Yi, Zi) of the feature points calculated in step S31 in the previous information processing cycle from the memory. The orientation change amount calculator 24 calculates the amounts of change in the distance and in the orientation angle using the relative positions (Xi, Yi, Zi) and the positions (Ui, Vi) in the image of the feature points that can be associated between the previous and current information processing cycles. The orientation change amount calculator 24 updates the distance and the orientation angle by adding the above-mentioned amounts of change in the distance and in the orientation angle to the distance and the orientation angle obtained in the previous information processing cycle. Thereafter, the orientation change amount calculator 24 stores the distance and the orientation angle thus updated in the memory. In other words, the orientation change amount calculator 24 updates the distance and the orientation angle by performing the integration operation on the distance and the orientation angle set by the process in step S34 or S37 (described later) in the previous cycle, adding to them the amounts of change in the distance and in the orientation angle calculated in the current information processing cycle. Thereafter, the ECU 13 proceeds to step S38.
[0155] On the other hand, in step S35, the patterned light beam extractor 21 sets, from the average brightness of the image obtained with the camera 12, the number of cycles for the synchronous detection that is necessary to extract the patterned light beam.
[0156]In step S36, the patterned light beam extractor 21 extracts the patterned light beam from a group of images obtained by synchronous detection in the current reference signal cycle.
[0157] In step S37, the patterned light beam extractor 21 generates the superimposed image by superimposing the images of the patterned light beam extracted by the synchronous detection in the past cycles, where the number of images thus superimposed corresponds to the number of cycles set in step S35. The patterned light beam extractor 21 further extracts the position of the patterned light beam from the superimposed image generated in this way. Based on the position of the patterned light beam, the orientation angle calculator 22 calculates the distance and the orientation angle.
[0158] In step S38, the ECU 13 selects the starting points for the integration operations. This is achieved by selecting the distance and the orientation angle calculated from the position of the patterned light beam in the first information processing cycle, and setting the starting points for the integration operations at the distance and the orientation angle thus selected. Subsequently, in a case where a predetermined condition is satisfied, that is, in a case where the condition of feature point detection by the feature point detector 23 deteriorates so that multiple feature points cannot be detected at the timing when the flag is set to "1", the ECU 13 resets the starting points for calculating the amount of movement to the distance and the orientation angle calculated from the position of the patterned light beam, in other words to the distance and the orientation angle calculated by the process in step S37. On the other hand, in a case where the feature point detector 23 detects the feature points normally, the ECU 13 updates the distance and the orientation angle based on the positions of the respective feature points.
[0159] In other words, in a case where the feature point detector 23 does not detect the feature points normally, the distance and the orientation angle of the camera 12 cannot be set with high accuracy. If the amount of movement of the vehicle is calculated using an inaccurate distance and orientation angle, the amount of movement of the vehicle cannot be detected with high accuracy. In such a case, therefore, the starting points for calculating the amount of movement are set at the distance and the orientation angle calculated from the position of the patterned light beam. In this way, large errors in the distance and the orientation angle are prevented.
[0160] Subsequently, in step S39, the autoposition calculator 26 calculates the amount (ΔL) of movement of the camera 12 relative to the road surface 31, that is, the amount of movement of the vehicle 10, from the distance and the orientation angle calculated by the process in step S34 or S37, the starting points of the integration operations, and the amounts of change in the positions (Ui, Vi) in the image of the feature points.
[0161] Therefore, the self-position calculating apparatus of the third embodiment is capable of calculating the position of the vehicle 10 by repeatedly performing the series of information processing cycles described above to add up the amounts of movement of the vehicle 10. Effects of the Third Embodiment
[0162] As described above, according to the third embodiment, the brightness determination section 25 makes a determination on the detected condition of the patterned light beam. If the detected condition of the patterned light beam is equal to or greater than the predetermined threshold value, the patterned light beam extractor 21 generates the superimposed image by superimposing the images in the respective frames, wherein the number of images thus superimposed corresponds to the predetermined number of cycles. For this reason, even when the outside environment is bright, the patterned light beam projected onto the road surface can be accurately detected. Consequently, the self-position of the vehicle can be accurately calculated.
[0163] Furthermore, the number of cycles required for the patterned light beam extractor 21 to generate the superimposed image is set depending on the detected condition of the patterned light beam, such as the average brightness of the image captured with the camera 12. For this reason, the brightness value of the patterned light beam to be detected can be controlled depending on how bright the external environment is. Consequently, the patterned light beam can be accurately detected. Other Embodiments
[0164] While the first to third embodiments of the present invention have been described above, none of the descriptions and drawings that form part of the disclosure should be construed as limiting the present invention. The disclosure will make various alternative embodiments, examples and operational techniques clear to those skilled in the art.
[0165] The first to third embodiments of the present invention mainly discussed the case where the superimposed image is generated by superimposing images obtained with the camera 12 from the past up to the present. However, the superimposed image may be generated by superimposing images that include one or more images to be captured with the camera 12 in the future. The inclusion of images to be obtained in the future is assumed to occur, for example, in a case where the patterned light beam is projected intermittently rather than constantly. In this case, the patterned light beam extractor 21 generates the superimposed image once all the images necessary to extract the patterned light beam are obtained. The autoposition calculator may be configured to start the integration operations using the previous values or the initial values set as the starting points before all the images needed to extract the patterned light beam are obtained (while the superimposed image is being generated).
[0166] Although Figure 2 shows the example where the camera 12 and the light projector 11 are installed at the front of the vehicle 10, the camera 12 and the light projector 11 may be installed on the sides, the rear or the underside of the vehicle 10. Furthermore, although Figure 2 shows a four-wheeled passenger car as an example of the vehicle 10 of the embodiments, the present invention is applicable to all moving bodies (vehicles) such as motorcycles, trucks and special vehicles for transporting construction machines, provided that feature points on road surfaces and wall surfaces can be captured from such moving bodies. LIST OF REFERENCE NUMERALS: 13 ECU; 10 vehicle; 11 light projector; 12 camera (image capture unit); 21 patterned light beam extractor (superimposed image generator); 22 orientation angle calculator; 23 feature point detector; 24 orientation change amount calculator; 25 brightness determination section (patterned light beam detection condition determination section); 26 autoposition calculator; 28 detection condition determination section; 29 calculation state determination section; 30 road surface condition determination section; 31 road surface; 32a, 32b patterned light beam; Te feature point
Claims:
Claims (5)
[0001]
1. A self-position calculating apparatus adapted to calculate the self-position of a vehicle (10), CHARACTERIZED in that it comprises: a light projector (11) configured to project a patterned light beam (32a, 32b) onto a road surface (31) around the vehicle (10); an image capture unit (12) adapted to be installed in the vehicle (10), and configured to capture an image of the road surface (31) around the vehicle which includes an area onto which the patterned light beam (32a, 32b) is projected; a patterned light beam extractor (21) configured to extract a position of the patterned light beam (32a, 32b) from the image obtained with the image capture unit (12); an orientation angle calculator (22) configured to calculate an orientation angle of the vehicle (10) with respect to the road surface (31) from the extracted position of the patterned light beam (32a, 32b); an orientation change amount calculator (24) configured to calculate an amount of change in the orientation of the vehicle based on temporal changes at a plurality of characteristic points (Te) on the road surface in the image obtained with the image capture unit (12); and an autoposition calculator (26) configured to calculate a current position and a current orientation angle of the vehicle by adding the amount of change in orientation to an initial position and an initial orientation angle of the vehicle, the initial orientation angle being calculated by the orientation angle calculator (22), wherein, if a detected brightness condition of the patterned light beam (32a, 32b) is equal to or greater than a threshold value, the patterned light beam extractor (21) generates a superimposed image by superimposing images in frames obtained with the image capture unit (12), and extracts the position of the patterned light beam (32a, 32b) from the superimposed image.
[0002]
2. Self-position calculating apparatus, according to claim 1, CHARACTERIZED by the fact that the patterned light beam extractor (21) sets the number of images to be superimposed to generate the superimposed image depending on a brightness value of the image obtained with the image capture unit (12).
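A minimal sketch of how such a brightness-dependent choice might look (the thresholds and frame counts below are purely illustrative assumptions, not values from the embodiments):

    # The brighter the captured image, the more frames are superimposed.
    import numpy as np

    def frames_to_superimpose(image):
        mean_brightness = float(np.mean(image))   # 0..255 grayscale image
        if mean_brightness < 60:                  # dark scene: pattern already visible
            return 1
        elif mean_brightness < 150:
            return 4
        else:                                     # bright scene: pattern easily drowned out
            return 8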
[0003]
3. Self-position calculating apparatus according to claim 1 or 2, CHARACTERIZED by the fact that the self-position calculator (26) starts adding the amount of change in orientation using, as a starting point, the orientation angle employed in a previous information processing cycle or the initial orientation angle, while the patterned light beam extractor (21) is generating the superimposed image.
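Purely as an illustration of this fallback (hypothetical names; not the embodiment's code), the choice of starting point could be expressed as:

    # While the superimposed image is still being generated, integration starts
    # from the previous cycle's orientation angle, or the initial angle if no
    # previous cycle exists; once the pattern is extracted, the angle is reset.
    def starting_angle(previous_angle, initial_angle, superimposed_ready, angle_from_pattern=None):
        if superimposed_ready and angle_from_pattern is not None:
            return angle_from_pattern
        return previous_angle if previous_angle is not None else initial_angle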
[0004]
4. Self-position calculating apparatus, according to any one of claims 1 to 3, CHARACTERIZED in that it additionally comprises a patterned light beam controller (27) configured to modulate the brightness of the patterned light beam (32a, 32b) with a predetermined modulation frequency, wherein the patterned light beam extractor (21) generates the superimposed image by superimposing synchronous images obtained by performing synchronous detection, with the predetermined modulation frequency, on the images obtained with the image capture unit (12), the number of synchronous images thereby superimposed corresponding to a predetermined number of cycles.
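As an informal sketch of synchronous detection under assumed parameters (the frame rate, modulation frequency, square-wave reference and frame count are all illustrative assumptions), each frame is weighted by a reference waveform at the modulation frequency and the weighted frames are accumulated over a predetermined number of cycles, so that unmodulated ambient light tends to cancel out:

    # Minimal synchronous-detection sketch (hypothetical sampling scheme).
    import numpy as np

    def synchronous_superimpose(frames, frame_rate, mod_freq, n_cycles):
        frames = [f.astype(np.float32) for f in frames]
        t = np.arange(len(frames)) / frame_rate
        reference = np.where(np.sin(2.0 * np.pi * mod_freq * t) >= 0, 1.0, -1.0)
        needed = int(round(n_cycles * frame_rate / mod_freq))   # frames in n_cycles
        acc = np.zeros_like(frames[0])
        for frame, r in zip(frames[:needed], reference[:needed]):
            acc += r * frame                                    # demodulate and accumulate
        return acc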
[0005]
5. Self-position calculation method adapted to calculate the self-position of a vehicle (10), CHARACTERIZED in that it comprises: a step of projecting a patterned light beam (32a, 32b) onto a road surface (31) around the vehicle (10); a step of causing an image capture unit (12) to capture an image of the road surface around the vehicle that includes an area onto which the patterned light beam (32a, 32b) is projected; a step of extracting a position of the patterned light beam (32a, 32b) from the image obtained with the image capture unit (12); a step of calculating an orientation angle of the vehicle with respect to the road surface (31) from the extracted position of the patterned light beam (32a, 32b); a step of calculating an amount of change in vehicle orientation based on temporal changes at a plurality of characteristic points on the road surface in the image obtained with the image capture unit (12); and a step of calculating a current position and a current orientation angle of the vehicle (10) by adding the amount of change in orientation to an initial position and an initial orientation angle of the vehicle, the initial orientation angle being calculated in the step of calculating the orientation angle, wherein, in the step of extracting the position of the patterned light beam (32a, 32b), if a detected condition of the patterned light beam (32a, 32b) is equal to or greater than a threshold value, a superimposed image is generated by superimposing the images in frames obtained with the image capture unit (12), and the position of the patterned light beam is extracted from the superimposed image.